Amdocs
A JUnit test case verifies the functionality of a Java method using assertions to check expected outcomes.
Use @Test annotation to define a test method.
Utilize assertions like assertEquals, assertTrue, etc., to validate results.
Example: @Test public void testAddition() { assertEquals(5, add(2, 3)); }
Use @Before to set up any necessary preconditions before tests.
Use @After to clean up resources after tests.
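A minimal sketch of such a test class, assuming JUnit 4; the class name, the scratch field, and the add method here are illustrative, not taken from the original answer:

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

public class AdditionTest {

    private int[] scratch;   // stands in for a resource that needs setup/teardown

    // The method under test; in a real project this would live in production code
    static int add(int a, int b) {
        return a + b;
    }

    @Before
    public void setUp() {
        // Runs before each test: prepare preconditions
        scratch = new int[4];
    }

    @Test
    public void testAddition() {
        // assertEquals(expected, actual) fails the test if the values differ
        assertEquals(5, add(2, 3));
        assertTrue(add(2, 3) > 0);
    }

    @After
    public void tearDown() {
        // Runs after each test: release resources
        scratch = null;
    }
}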
Remote backend in Terraform allows state management in a centralized location, enabling collaboration and consistency.
Enables multiple team members to work on the same infrastructure without conflicts.
Common remote backends include AWS S3, Azure Blob Storage, and HashiCorp Consul.
Example: Using AWS S3 as a backend requires specifying the bucket and key in the Terraform configuration.
State locking can be implemente...
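A minimal sketch of such a backend block, assuming the S3 backend; the bucket, key, and DynamoDB table names below are placeholders, not values from the original answer:

terraform {
  backend "s3" {
    bucket         = "my-terraform-state"           # placeholder bucket name
    key            = "envs/dev/terraform.tfstate"   # placeholder path to the state file
    region         = "us-east-1"
    encrypt        = true
    dynamodb_table = "terraform-locks"              # enables state locking via DynamoDB
  }
}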
Terraform automates infrastructure provisioning, ensuring consistency and scalability in our DevOps workflows.
Infrastructure as Code (IaC): We define our infrastructure using Terraform configuration files, enabling version control and collaboration.
Environment Management: Terraform allows us to create and manage multiple environments (dev, staging, production) with ease.
Resource Provisioning: We use Terraform to p...
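A hypothetical fragment showing how one resource definition can serve several environments; the variable and bucket naming scheme are illustrative only:

variable "environment" {
  type    = string
  default = "dev"   # overridden per environment (dev, staging, production)
}

resource "aws_s3_bucket" "app_data" {
  bucket = "myapp-${var.environment}-data"   # placeholder naming convention

  tags = {
    Environment = var.environment
    ManagedBy   = "terraform"
  }
}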
Basic coding in Java involves understanding syntax, data types, control structures, and object-oriented principles.
Java is a statically typed language, meaning variable types must be declared. Example: 'int number = 5;'
Control structures include if-else statements and loops. Example: 'for (int i = 0; i < 10; i++) { System.out.println(i); }'
Java supports object-oriented programming. Example: 'class Dog { void ba...
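A small, self-contained sketch combining those three ideas; the Dog class and its method are illustrative:

public class Basics {
    public static void main(String[] args) {
        int number = 5;                 // statically typed: the variable's type is declared
        if (number > 0) {               // control structure: if-else
            System.out.println("positive");
        }
        for (int i = 0; i < 3; i++) {   // control structure: loop
            System.out.println(i);
        }
        new Dog().bark();               // object-oriented: create an object and call its method
    }
}

class Dog {
    void bark() {
        System.out.println("Woof");
    }
}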
I appeared for an interview in Aug 2024.
I applied via Campus Placement
Big data refers to large and complex data sets that are difficult to process using traditional data processing applications.
Big data involves large volumes of data
It includes data from various sources such as social media, sensors, and business transactions
Big data requires specialized tools and technologies for processing and analysis
Spark is a distributed computing framework that processes big data in memory and is known for its speed and ease of use.
It keeps data in memory across operations, avoiding repeated disk reads, which is what makes it faster than traditional disk-based processing.
It uses Resilient Distributed Datasets (RDDs) for fault-tolerant distributed data processing.
Spark provides high-level APIs in Java, Scala, Python, and R for ease of use.
It supports various data sources li...
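A minimal Java sketch of the RDD API described above; the local master setting and the small input list are placeholders for illustration:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import java.util.Arrays;

public class SparkExample {
    public static void main(String[] args) {
        // Run locally with all cores; on a cluster the master URL would differ
        SparkConf conf = new SparkConf().setAppName("SparkExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // An RDD is a fault-tolerant, partitioned collection processed in memory
        JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
        int sumOfSquares = numbers.map(n -> n * n).reduce(Integer::sum);

        System.out.println("Sum of squares: " + sumOfSquares);
        sc.close();
    }
}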
Our application is a data engineering platform that processes and analyzes large volumes of data to provide valuable insights.
Our application uses various data processing techniques such as ETL (Extract, Transform, Load) to clean and transform raw data into usable formats.
We utilize big data technologies like Hadoop, Spark, and Kafka to handle large datasets efficiently.
The application also includes machine learning al...
I applied via Referral and was interviewed in Apr 2024. There were 4 interview rounds.
I applied via Campus Placement and was interviewed in Jul 2024. There were 2 interview rounds.
I applied via Campus Placement and was interviewed in May 2024. There were 2 interview rounds.
I am a dedicated Technical Support Analyst with 5 years of experience in troubleshooting hardware and software issues.
5 years of experience in technical support
Skilled in troubleshooting hardware and software issues
Strong communication and problem-solving skills
I have a strong technical background, excellent problem-solving skills, and a proven track record of providing top-notch customer support.
I have a Bachelor's degree in Computer Science and 3 years of experience in technical support roles.
I am proficient in troubleshooting hardware and software issues, and have a knack for finding solutions quickly.
I have received positive feedback from previous employers and customers ...
I applied via Company Website
A coding task: identify errors in the given code, write JUnit tests for it, and optimise the code as well.
Implementing a stack using two queues
Use two queues to simulate a stack
Push operation: Enqueue the element to queue 1
Pop operation: Move all but the last element from queue 1 to queue 2, dequeue that remaining element (the stack top), then swap the two queues
Top operation: Do the same rotation as pop but push the dequeued element back, since the stack top sits at the rear of queue 1
Example: Push 1, 2, 3 - Queue 1: [1, 2, 3], Queue 2: []
Example: Pop returns 3 - Queue 1: [1, 2], Queue 2: []
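A compact Java sketch of this approach, with push O(1) and pop/top rotating elements; the class and method names are my own:

import java.util.ArrayDeque;
import java.util.Queue;

public class StackUsingTwoQueues {
    private Queue<Integer> q1 = new ArrayDeque<>();
    private Queue<Integer> q2 = new ArrayDeque<>();

    // Push: simply enqueue to queue 1
    public void push(int x) {
        q1.add(x);
    }

    // Pop: move all but the last element to queue 2, take the last one, then swap the queues
    public int pop() {
        while (q1.size() > 1) {
            q2.add(q1.remove());
        }
        int top = q1.remove();
        Queue<Integer> tmp = q1;
        q1 = q2;
        q2 = tmp;
        return top;
    }

    // Top: pop the element and push it back, preserving order
    public int top() {
        int top = pop();
        push(top);
        return top;
    }

    public boolean isEmpty() {
        return q1.isEmpty();
    }
}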
I appeared for an interview in Jan 2025, where I was asked the following questions.
Fiber engineering involves the design, development, and application of fiber materials for various industries.
Focuses on creating synthetic and natural fibers for textiles, composites, and other applications.
Involves understanding fiber properties like tensile strength, elasticity, and thermal resistance.
Examples include developing carbon fibers for aerospace or biodegradable fibers for sustainable textiles.
Incorporate...
I'm motivated to join this company due to its innovative approach and commitment to sustainability in fiber engineering.
The company's focus on cutting-edge technology aligns with my passion for innovation, as seen in my previous projects on smart textiles.
I admire the company's commitment to sustainability, which resonates with my values and my work on eco-friendly fiber solutions.
The collaborative culture here excites...
I applied via Company Website and was interviewed before Nov 2023. There were 4 interview rounds.
I am a dedicated Compensation and Benefits Analyst with a strong background in HR and a passion for ensuring fair and competitive compensation packages for employees.
I have a Bachelor's degree in Human Resources Management
I have 5 years of experience in analyzing compensation and benefits programs
I am proficient in conducting market research and benchmarking studies
I have a strong understanding of labor laws and regula...
Excel Assessment Logical and Technical
Various tools in Excel for data analysis and manipulation.
Pivot tables for summarizing and analyzing data
VLOOKUP and HLOOKUP for searching and retrieving specific information
Conditional formatting for highlighting important data
Data validation for controlling input values
Charts and graphs for visualizing data trends
Data analytics is the process of analyzing raw data to draw conclusions and make informed decisions.
Data analytics involves collecting, processing, and analyzing data to identify trends and patterns.
It helps organizations make data-driven decisions and improve business performance.
Examples of data analytics tools include Tableau, Power BI, and Google Analytics.
Creditworthiness can be calculated by assessing an individual's financial history, income, debt-to-income ratio, and credit score.
Evaluate the individual's credit score, which is a numerical representation of their creditworthiness based on their credit history.
Assess the individual's debt-to-income ratio to determine their ability to repay debts.
Review the individual's financial history, including any past bankruptcie...
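As a small worked example (figures are hypothetical): monthly debt payments of ₹30,000 against a gross monthly income of ₹1,00,000 give a debt-to-income ratio of 30%, which lenders would generally treat as manageable; the same debts against an income of ₹50,000 give 60%, a much weaker signal of repayment capacity.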
The duration of the Amdocs interview process can vary, but it typically takes less than 2 weeks to complete, based on 43 interview experiences.
Software Developer (8.5k salaries): ₹9 L/yr - ₹15.5 L/yr
Software Engineer (2k salaries): ₹6.8 L/yr - ₹16.2 L/yr
Softwaretest Engineer (1.8k salaries): ₹5.8 L/yr - ₹13.8 L/yr
Functional Test Engineer (1.2k salaries): ₹5 L/yr - ₹12.2 L/yr
Associate Software Engineer (946 salaries): ₹4.8 L/yr - ₹10 L/yr
TCS
IBM
Oracle
Carelon Global Solutions