Adagrad AI
National Highways & Infrastructure Development Corporation Interview Questions and Answers
Q1. Rotate an image in your choice of language
To rotate an image in Python, use the Pillow library's rotate() method; a short sketch follows the steps below.
Import the Image module from the Pillow library
Open the image using the open() method
Use the rotate() method to rotate the image by the desired angle
Save the rotated image using the save() method
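A minimal sketch of those steps, assuming Pillow is installed; the file names are placeholders.

```python
# Minimal sketch: rotate an image with Pillow (file names are placeholders).
from PIL import Image

# Open the source image.
img = Image.open("input.jpg")

# Rotate 90 degrees counter-clockwise; expand=True enlarges the canvas
# so the corners are not cropped.
rotated = img.rotate(90, expand=True)

# Save the rotated image.
rotated.save("rotated.jpg")
```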
Q2. What is the difference between CNN and RNN
CNN is used for image recognition while RNN is used for sequence data like text or speech.
CNN stands for Convolutional Neural Network and is used for image recognition tasks.
RNN stands for Recurrent Neural Network and is used for sequence data like text or speech.
CNN has convolutional layers for feature extraction, while RNN has recurrent connections for sequential data processing.
CNN is good at capturing spatial dependencies in data, while RNN is good at capturing temporal dependencies.
Example: a CNN can classify objects in images, while an RNN can predict the next word in a sentence; a brief layer-level comparison is sketched below.
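An illustrative sketch of that structural difference, assuming PyTorch; the layer sizes and input shapes are arbitrary.

```python
# Sketch: a tiny CNN for image batches vs. a tiny RNN for sequence batches (PyTorch assumed).
import torch
import torch.nn as nn

# CNN: convolutional layers slide filters over the spatial dimensions of an image.
cnn = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)

# RNN: recurrent connections carry a hidden state across the time steps of a sequence.
rnn = nn.RNN(input_size=8, hidden_size=32, batch_first=True)

image_batch = torch.randn(4, 3, 64, 64)   # (batch, channels, height, width)
sequence_batch = torch.randn(4, 20, 8)    # (batch, time steps, features)

print(cnn(image_batch).shape)             # torch.Size([4, 10])
output, hidden = rnn(sequence_batch)
print(output.shape)                       # torch.Size([4, 20, 32])
```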
Q3. Why are activation functions used
Activation functions are used to introduce non-linearity into neural networks, allowing them to learn complex patterns and relationships.
Activation functions help neural networks to learn complex patterns and relationships by introducing non-linearity.
They help in controlling the output of a neuron, ensuring that it falls within a desired range.
Common activation functions include ReLU, Sigmoid, Tanh, and Leaky ReLU.
Without activation functions, a neural network would collapse into a single linear transformation of its input, no matter how many layers it has; the common activations are sketched below.
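A small sketch of those common activation functions, assuming only NumPy.

```python
# Sketch of common activation functions with NumPy.
import numpy as np

def relu(x):
    # Zero for negative inputs, identity for positive ones.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into the range (-1, 1).
    return np.tanh(x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope for negative inputs.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x), sigmoid(x), tanh(x), leaky_relu(x), sep="\n")
```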
Q4. Is Logloss function differentiable
Yes, the log loss function is differentiable.
Log loss is differentiable because it is a smooth, continuous function of the predicted probability on the open interval (0, 1).
Its derivative can be worked out with basic calculus.
Differentiability matters because optimization algorithms like gradient descent need gradients to converge smoothly.
Example: for binary classification, the derivative of log loss with respect to the logit (the pre-sigmoid input) is (predicted probability - actual label); the sketch below checks this numerically.
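A short sketch that compares the analytic gradient with a numerical estimate, assuming NumPy; the logit and label values are arbitrary.

```python
# Sketch: binary log loss and its gradient with respect to the logit, checked numerically.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(z, y):
    # Log loss for a single example, written in terms of the logit z.
    p = sigmoid(z)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def log_loss_grad(z, y):
    # Analytic derivative with respect to the logit: predicted probability - label.
    return sigmoid(z) - y

z, y, eps = 0.7, 1.0, 1e-6
numeric = (log_loss(z + eps, y) - log_loss(z - eps, y)) / (2 * eps)
print(numeric, log_loss_grad(z, y))  # the two values agree closely
```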
Q5. What is learning rate
Learning rate is a hyperparameter that controls how much we are adjusting the weights of our network with respect to the loss gradient.
Learning rate determines the size of the steps taken during optimization.
A high learning rate can cause the model to overshoot the optimal weights, while a low learning rate can result in slow convergence.
Common learning rate values are 0.1, 0.01, 0.001, etc.
Learning rate can be adjusted during training using techniques like learning rate schedules (for example, step decay); its effect on convergence is sketched below.
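A toy sketch of how the learning rate scales each gradient descent update, using a one-parameter quadratic; the values are chosen only for illustration.

```python
# Sketch: one-parameter gradient descent on f(w) = (w - 3)^2 with different learning rates.
def gradient_descent(lr, steps=25, w=0.0):
    for _ in range(steps):
        grad = 2 * (w - 3)   # derivative of (w - 3)^2
        w = w - lr * grad    # the learning rate scales the step size
    return w

for lr in (1.5, 0.1, 0.001):
    print(lr, gradient_descent(lr))
# 1.5 overshoots and diverges, 0.1 converges close to 3, 0.001 barely moves in 25 steps.
```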
Q6. A breadth first search problem
BFS is a graph traversal algorithm that visits all the vertices of a graph in breadth-first order
It uses a queue to keep track of the nodes to be visited next
BFS can be used to find the shortest path between two nodes in an unweighted graph, as in the sketch below
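A minimal BFS sketch over an adjacency-list graph, returning shortest path lengths from a source node; the example graph is made up.

```python
# Sketch: breadth-first search over an adjacency list, tracking distances from the source.
from collections import deque

def bfs_distances(graph, source):
    # graph: dict mapping each node to a list of its neighbours.
    distances = {source: 0}
    queue = deque([source])          # queue of nodes to visit next
    while queue:
        node = queue.popleft()
        for neighbour in graph[node]:
            if neighbour not in distances:   # visit each node only once
                distances[neighbour] = distances[node] + 1
                queue.append(neighbour)
    return distances

graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(bfs_distances(graph, "A"))  # {'A': 0, 'B': 1, 'C': 1, 'D': 2}
```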