I applied via Referral and was interviewed in Apr 2023. There were 2 interview rounds.
I applied via Approached by Company and was interviewed in May 2023. There were 3 interview rounds.
Sigmoid is used for binary classification, while softmax is used for multi-class classification.
The sigmoid function outputs a value between 0 and 1, suitable as the probability of the positive class in a binary classification task.
The softmax function outputs a probability distribution over multiple classes, with the probabilities summing to 1.
Softmax is the generalization of sigmoid to more than two classes; with exactly two classes it reduces to sigmoid.
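A minimal sketch of the two functions in plain Python (the function names and example logits are illustrative, not from the interview):

```python
import math

def sigmoid(x):
    # Squashes a single score into (0, 1): probability of the positive class.
    return 1.0 / (1.0 + math.exp(-x))

def softmax(scores):
    # Converts a vector of scores into a probability distribution summing to 1.
    m = max(scores)                      # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Binary case: one logit through sigmoid.
p = sigmoid(0.5)

# Multi-class case: softmax over three logits.
probs = softmax([2.0, 1.0, 0.1])
```

Note that `softmax([x, 0.0])[0]` equals `sigmoid(x)`, which is the sense in which softmax generalizes sigmoid.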
An activation function is a mathematical function that determines the output of a neuron given its weighted inputs.
Activation functions introduce non-linearity into the neural network, allowing it to learn complex patterns in the data; without them, a stack of linear layers collapses to a single linear transformation.
Common activation functions include sigmoid, tanh, ReLU, and softmax.
The choice of activation function can affect both the performance and the training speed of the neural network.
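A quick sketch of two of the activation functions named above (illustrative only):

```python
import math

def relu(x):
    # Zero for negative inputs, identity otherwise; cheap to compute
    # and does not saturate for positive inputs.
    return max(0.0, x)

def tanh(x):
    # Squashes inputs into (-1, 1); zero-centred, unlike sigmoid.
    return math.tanh(x)
```

ReLU is a common default for hidden layers because its gradient does not vanish for positive inputs, while tanh's zero-centred output can help optimization in shallower networks.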
Build an NLP model on their dataset
| Role | Salaries reported | Salary range |
| --- | --- | --- |
| Data Analyst | 16 | ₹2.4 L/yr - ₹9.7 L/yr |
| Business Analyst | 8 | ₹3.9 L/yr - ₹9 L/yr |
| Graphic Designer | 7 | ₹2.2 L/yr - ₹5.4 L/yr |
| Software Developer | 6 | ₹5 L/yr - ₹10.3 L/yr |
| E-Commerce Executive | 5 | ₹1.7 L/yr - ₹3.6 L/yr |
Bluestone.com
Caratlane Trading
Tanishq
Malabar Gold and Diamonds