I applied via Recruitment Consultant and was interviewed in Jan 2022. There were 2 interview rounds.
Optimizers update the weights of a feedforward neural network during training; many variants exist because each uses different optimization techniques and makes different trade-offs.
Different optimizers use different optimization techniques such as momentum, adaptive learning rates, and regularization.
Optimizers have different trade-offs such as convergence speed, generalization, and robustness to noisy data.
The choice of optimizer depends on the specific problem and data set.
Examples...
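The trade-offs above can be illustrated with a minimal sketch, in pure Python with made-up hyperparameters, comparing plain gradient descent to gradient descent with momentum on the toy objective f(w) = w²:

```python
def sgd(w, lr, steps):
    # Plain gradient descent: the gradient of w^2 is 2w
    for _ in range(steps):
        w -= lr * 2 * w
    return w

def sgd_momentum(w, lr, beta, steps):
    # Momentum keeps an exponentially decaying running sum of gradients,
    # which can speed convergence but may overshoot and oscillate
    v = 0.0
    for _ in range(steps):
        v = beta * v + 2 * w
        w -= lr * v
    return w

print(sgd(1.0, lr=0.1, steps=20))                    # approaches 0
print(sgd_momentum(1.0, lr=0.1, beta=0.9, steps=20)) # approaches 0, with oscillation
```

Adaptive-learning-rate methods such as Adam build on the same idea, additionally rescaling each weight's step by running gradient statistics.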
Layer normalisation is a technique used to normalise the inputs to each layer of a feedforward neural network.
It is similar to batch normalisation, but it normalises across the features of each individual example rather than across the batch.
It helps in reducing the internal covariate shift problem.
It can be applied to any type of activation function.
It is particularly useful in recurrent neural networks.
Example: LayerNorm in PyTorch.
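As a sketch of what `torch.nn.LayerNorm` computes per example, here is the same calculation in plain Python (eps and the learnable gamma/beta parameters follow the usual convention; the input values are illustrative):

```python
import math

def layer_norm(x, gamma=None, beta=None, eps=1e-5):
    # Normalise one example's features to zero mean and unit variance,
    # then apply an optional learned scale (gamma) and shift (beta).
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    normed = [(v - mean) / math.sqrt(var + eps) for v in x]
    gamma = gamma or [1.0] * n
    beta = beta or [0.0] * n
    return [g * v + b for g, v, b in zip(gamma, normed, beta)]

out = layer_norm([1.0, 2.0, 3.0, 4.0])
print(out)  # features now have mean ~0 and variance ~1
```

Because the statistics come from a single example, the computation is independent of batch size, which is why it fits recurrent models better than batch normalisation.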
Build an airline system to recommend routes from city 1 to city 2 with direct and connecting flights.
Create a graph with cities as nodes and connections as edges
Use Dijkstra's algorithm to find the shortest path between city 1 and city 2
For connecting flights, find all possible paths with one or more stops
Sort and recommend routes based on total distance and number of stops
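The steps above can be sketched as follows; the city names and distances are made up for illustration, and the graph is stored as an adjacency dict:

```python
import heapq

# Cities as nodes, direct flights as weighted edges (hypothetical distances)
graph = {
    "city1": {"A": 300, "B": 450},
    "A": {"city2": 400},
    "B": {"city2": 200},
    "city2": {},
}

def shortest_route(graph, src, dst):
    # Dijkstra's algorithm with a min-heap; each heap entry also carries
    # the path taken so far, so we can recommend the full route.
    heap = [(0, src, [src])]
    seen = set()
    while heap:
        dist, node, path = heapq.heappop(heap)
        if node == dst:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, weight in graph[node].items():
            if nbr not in seen:
                heapq.heappush(heap, (dist + weight, nbr, path + [nbr]))
    return float("inf"), []

print(shortest_route(graph, "city1", "city2"))  # the connecting flight via B wins
```

To recommend several routes instead of just the best one, the same search can be left running past the first arrival at the destination, collecting candidate paths to rank by distance and stop count.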
This function takes a string input and returns all possible permutations of its characters, with duplicates caused by repeated letters removed.
Use the itertools module to generate all possible permutations of the string.
Use a set to filter out duplicate permutations that arise from repeated letters.
Convert the filtered permutations into a list of strings.
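The three steps above come together in a short sketch (the function name is mine):

```python
from itertools import permutations

def unique_permutations(s):
    # Generate every ordering of the characters, deduplicate with a set
    # (repeated letters produce identical orderings), then return a
    # sorted list of strings.
    return sorted({"".join(p) for p in permutations(s)})

print(unique_permutations("aab"))  # ['aab', 'aba', 'baa']
```

Note that for a string with many repeated letters this still enumerates all n! orderings before deduplicating, so it is simple rather than optimal.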
Software Engineer | 29 salaries | ₹4 L/yr - ₹13.8 L/yr
Engineer | 7 salaries | ₹4 L/yr - ₹6 L/yr
Implementation Lead | 5 salaries | ₹5 L/yr - ₹11.3 L/yr
Project Manager | 5 salaries | ₹8.8 L/yr - ₹18.2 L/yr
Software Developer | 5 salaries | ₹3.6 L/yr - ₹5 L/yr