Kaggle
No reviews yet
About Kaggle
Founded in: --
India Employee Count: --
Global Employee Count: --
Headquarters: --
Office Locations: --
Website: kaggle.com
Primary Industry: --
Other Industries: --
Compare Kaggle with Similar Companies
| Metric | Kaggle | Comparison Company 1 | Comparison Company 2 | Comparison Company 3 |
|---|---|---|---|---|
| Overall Rating | No reviews yet | 3.7/5 (92.9k reviews) | 3.7/5 (54.7k reviews) | 3.8/5 (58.7k reviews) |
| Highly Rated for | - | Job security, Work-life balance | Job security | No highly rated category |
| Critically Rated for | - | Promotions, Salary, Work satisfaction | Promotions, Salary | Promotions |
| Primary Work Policy | - | Work from office (reported by 80% of employees) | Hybrid (reported by 62% of employees) | Hybrid (reported by 75% of employees) |
| Rating by Women Employees | No rating available | 3.7 (Good), rated by 27k women | 3.8 (Good), rated by 15.7k women | 3.8 (Good), rated by 22.1k women |
| Rating by Men Employees | No rating available | 3.6 (Good), rated by 60.7k men | 3.7 (Good), rated by 36.5k men | 3.8 (Good), rated by 34k men |
| Job security | Data not available | 4.5 (Good) | 3.8 (Good) | 3.7 (Good) |
Kaggle Salaries
| Role | Salaries Reported | Salary Range |
|---|---|---|
| Data Scientist | 2 | ₹2.9 L/yr - ₹3.7 L/yr |
| Data Engineer | 1 | ₹13.5 L/yr - ₹17.2 L/yr |
| Freelancer | 1 | ₹15.3 L/yr - ₹19.5 L/yr |
| Data Analyst Intern | 1 | ₹5.4 L/yr - ₹6.9 L/yr |
| Machine Learning Engineer | 1 | ₹8.3 L/yr - ₹10.6 L/yr |
| Jr. Data Scientist | 1 | ₹2.7 L/yr - ₹3.5 L/yr |
| Practitioner | 1 | ₹3.2 L/yr - ₹4.1 L/yr |
| Data Science Intern | 1 | ₹5.4 L/yr - ₹6.9 L/yr |
| Machine Learning Specialist | 1 | ₹6.8 L/yr - ₹8.6 L/yr |
Kaggle News
Beginner Machine Learning Project: Step-by-Step Guide to Predicting London House Prices
- The article provides a step-by-step guide on predicting London house prices using machine learning.
- The dataset used in the project is sourced from Kaggle, and the notebook is available on both Kaggle and GitHub.
- The author recommends following along with the notebook for the complete code.
- The main preparation steps involve dataset inspection, conversion to NumPy arrays, splitting into train and test sets, and normalization of the data.
- Data exploration is crucial, including visualizing features and conducting correlation analysis.
- Standard machine learning models are deemed sufficient for the relatively linear dataset; deep learning is not considered necessary.
- Linear Regression initially showed poor results, leading to experimentation with tree-based models like Random Forest.
- Gradient Boosting was found to be the most effective model, providing a solid score that was hard to beat.
- A PyTorch neural network was also implemented, but Gradient Boosting remained the preferred choice due to its performance.
- The article emphasizes the importance of data normalization, understanding evaluation metrics such as the R² score, and tuning hyperparameters for model performance (see the sketch below).
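A minimal sketch of that workflow with scikit-learn, for illustration only: the file name "london_house_prices.csv" and the target column "price" are assumptions, not details from the article, and the features are assumed to be numeric after basic cleaning.

```python
# Hypothetical sketch: split, normalize, fit Gradient Boosting, report the R² score.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("london_house_prices.csv")      # assumed file name
X = df.drop(columns=["price"]).to_numpy()        # assumed target column "price"
y = df["price"].to_numpy()

# Split first, then normalize with statistics computed on the training set only.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, random_state=42)
model.fit(X_train, y_train)
print("R2 score:", r2_score(y_test, model.predict(X_test)))
```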
Medium | 5 Apr, 2025

Multiple Linear Regression for Machine Learning: A Step-by-Step Tutorial
- Multiple linear regression models the relationship between multiple independent variables and a dependent variable.
- The model takes the form y = β0 + β1x1 + β2x2 + β3x3 + ..., where y is the dependent variable, x1, x2, x3 are the independent variables, β0 is the intercept, and β1, β2, β3 are the corresponding coefficients.
- Applications include predicting house prices, mobile phone prices, and stock prices based on various factors.
- An example using a mobile prices dataset from Kaggle is illustrated.
- Data cleaning steps involve handling missing values, dropping rows, and encoding categorical variables.
- Normalization and conversion of features like 'Number of Ratings' and 'Price in INR' are performed.
- Splitting data into training and test sets is crucial for model evaluation.
- Initializing, training, and making predictions using the Linear Regression model are shown in the example.
- Evaluation metrics like Mean Squared Error (MSE) help assess the model's performance.
- Actual vs. predicted price outputs are demonstrated, along with the model's coefficients and intercept (see the sketch below).
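A minimal sketch of the same pipeline with scikit-learn; the file name "mobile_prices.csv" and the target column "Price in INR" are illustrative assumptions, and the tutorial's actual cleaning and normalization steps may differ.

```python
# Hypothetical sketch: multiple linear regression on a mobile-prices style dataset.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("mobile_prices.csv").dropna()   # assumed file; drop rows with missing values
df = pd.get_dummies(df, drop_first=True)         # encode categorical variables

X = df.drop(columns=["Price in INR"])            # assumed target column
y = df["Price in INR"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)
print("MSE:", mean_squared_error(y_test, pred))
print("Intercept (β0):", model.intercept_)
print("Coefficients (β1, β2, ...):", model.coef_)
```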
Self-Learning-Java | 16 Mar, 2025

How to Use Google's Gemma 3 Model for Free? | Online Gemma 3 | Free use Gemma 3 Online
- Google Colab is a cloud-based Jupyter Notebook environment for running Python code.
- To set up Gemma 3 in Google Colab, install the required dependencies and authenticate with Kaggle.
- Kaggle Notebooks also offer free access to Gemma 3 by enabling GPU and installing the necessary libraries.
- Alternatively, you can use Google AI Studio or Hugging Face to access Gemma 3 without any setup (see the Hugging Face example sketched below).
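A minimal sketch of the Hugging Face route, assuming a recent transformers release with Gemma 3 support, the model id "google/gemma-3-1b-it", an accepted model license, and a logged-in Hugging Face account; the article's Colab and Kaggle setup steps may differ.

```python
# Hypothetical sketch: run a small Gemma 3 checkpoint with the transformers pipeline.
# Assumes `pip install -U transformers accelerate` and `huggingface-cli login`.
from transformers import pipeline

generator = pipeline("text-generation", model="google/gemma-3-1b-it", device_map="auto")
messages = [{"role": "user", "content": "Summarize what Kaggle is in one sentence."}]
out = generator(messages, max_new_tokens=100)
print(out[0]["generated_text"][-1]["content"])   # last message is the model's reply
```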
Medium | 16 Mar, 2025
My First Deep Learning Project: Building a Naruto Image Classifier with FastAI
- For the author, a beginner in deep learning and an anime enthusiast, creating a Naruto image classifier served as an engaging starter project.
- The project involved collecting images using the duckduckgo_search Python library and utilizing Kaggle for model building and training.
- Git was used for version control, Visual Studio for developing a user-friendly interface, and Hugging Face Spaces for deploying the model.
- Data preparation included dataset cleaning, organization, and setting up data augmentation pipelines with FastAI tools.
- Transfer learning with a pre-trained CNN, specifically ResNet34, was used for model training.
- The model achieved an accuracy of approximately 89.8% with 123 correct predictions out of 137 validation images.
- Visualizing the top images with the highest loss revealed insights for improving the model.
- Utilizing FastAI’s ImageClassifierCleaner helped enhance dataset quality by identifying and removing problematic images.
- The deployment process involved exporting the model, creating a Gradio interface, and using Hugging Face Spaces for public accessibility.
- Visual Studio Code facilitated the deployment process, offering a smooth workflow from development to deployment.
- Challenges faced during the project included dataset quality, model architecture selection, and workflow optimization (see the fastai training sketch below).
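A minimal sketch of the transfer-learning step with fastai, assuming the downloaded images are organized into one sub-folder per character under naruto_images/; the hyperparameters are illustrative, not the project's actual settings.

```python
# Hypothetical sketch: fine-tune an ImageNet-pretrained ResNet34 with fastai.
from fastai.vision.all import *

path = Path("naruto_images")                     # assumed: one sub-folder per class
dls = ImageDataLoaders.from_folder(
    path, valid_pct=0.2, seed=42,
    item_tfms=Resize(224), batch_tfms=aug_transforms(),
)
learn = vision_learner(dls, resnet34, metrics=accuracy)
learn.fine_tune(3)                               # transfer learning from ImageNet weights

# Inspect the validation images with the highest loss, as the article describes.
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_top_losses(9)
```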
Medium | 8 Mar, 2025

Fine-Tuning A LLM Small Practical Guide With Resources
- The author initially failed to fine-tune a DeepSeek model and a Llama model but succeeded with a 7B-parameter Mistral model.
- Resources like the 3Blue1Brown channel and 'The Hundred-Page Machine Learning Book' were recommended for learning AI and neural network basics.
- Fine-tuning involves retraining an existing model on a new dataset to optimize its performance for a specific task.
- Steps for fine-tuning an LLM include loading a dataset, data preprocessing, model selection, parameter configuration, training, evaluation, and inference.
- Datasets for fine-tuning can be obtained from sources like HuggingFace, Kaggle, and opendatainception.
- Popular LLM models for fine-tuning include Llama-3.1B, DeepSeek-R1-Distill-Qwen-1.5B, and Mistral-7B-v0.x.
- Google Colab and Kaggle offer free GPU usage for fine-tuning models, while companies like Salad and VastAI provide GPU rental services at competitive prices.
- Notebooks like 'Fine-tuning DeepSeek R1' and 'Alpaca + Flame fine-tuning' are available as templates for LLM fine-tuning.
- After fine-tuning, models can be deployed using tools like Gradio or platforms like Hugging Face, with basic CPU and RAM options or premium plans for larger models (see the LoRA sketch below).
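A minimal LoRA fine-tuning sketch with transformers and peft, assuming the Mistral-7B-v0.1 checkpoint, the yahma/alpaca-cleaned dataset, and a GPU with enough memory; the dataset, prompt format, and hyperparameters are illustrative, not the author's exact setup.

```python
# Hypothetical sketch: LoRA fine-tuning of a Mistral 7B model on an instruction dataset.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_id = "mistralai/Mistral-7B-v0.1"                            # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

ds = load_dataset("yahma/alpaca-cleaned", split="train[:1000]")   # assumed dataset

def tokenize(batch):
    text = [f"{i}\n{o}" for i, o in zip(batch["instruction"], batch["output"])]
    return tokenizer(text, truncation=True, max_length=512)

ds = ds.map(tokenize, batched=True, remove_columns=ds.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments("mistral-lora", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("mistral-lora-adapter")    # saves only the LoRA adapter weights
```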
Dev | 3 Mar, 2025

IDM VTON : Virtual Try On APP Automatic Installers for Windows, RunPod, Massed Compute and Kaggle notebook
- IDM VTON offers 1-click installers for Windows, RunPod, Massed Compute, and a free Kaggle account notebook.
- The app installs seamlessly on Windows, RunPod, Massed Compute, and Kaggle, using a Python 3.10 virtual environment (venv).
- The app includes extra features such as handling images of any resolution and aspect ratio, manual masking, and automatic image saving.
- It also supports 4-bit and 8-bit quantization, CPU offloading for lower-VRAM GPUs, and batch generation of multiple images.
Dev | 21 Feb, 2025

Getting started with NLP using Bert on Kaggle
- The article walks through getting started with NLP using BERT on Kaggle, covering import and EDA, tokenization, test and validation sets, metrics and correlation, model training, and predictions on the test set.
- The accompanying code imports the necessary libraries and datasets from Kaggle, tokenizes the text with BERT, splits the data into test and validation sets, defines the metric and correlation functions, trains the model, and generates predictions on the test set.
- The predictions are then saved to a CSV file named 'submission.csv' (see the pipeline sketch below).
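A minimal sketch of such a pipeline with Hugging Face transformers, assuming train.csv/test.csv files with "id", "text", and "label" columns and a binary classification target; the article's actual dataset, metric, and model head may differ.

```python
# Hypothetical sketch: fine-tune BERT for text classification and write submission.csv.
import pandas as pd
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

train_df = pd.read_csv("train.csv")              # assumed columns: id, text, label
test_df = pd.read_csv("test.csv")                # assumed columns: id, text

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

splits = Dataset.from_pandas(train_df).map(tokenize, batched=True).train_test_split(test_size=0.2)
test_ds = Dataset.from_pandas(test_df).map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments("bert-out", num_train_epochs=1, per_device_train_batch_size=16),
    train_dataset=splits["train"],
    eval_dataset=splits["test"],
)
trainer.train()

preds = trainer.predict(test_ds).predictions.argmax(axis=-1)
pd.DataFrame({"id": test_df["id"], "label": preds}).to_csv("submission.csv", index=False)
```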
Dev | 29 Jan, 2025

[D] Any good resources for pre-trained model weights?
- Obtaining pre-trained weights for specific architectures can be challenging, even for popular ones like ResNet.
- Weights for models trained on large proprietary datasets such as JFT-300M or YouTube-100M are generally not released.
- Existing resources like Hugging Face, Kaggle, and PyTorch offer pre-trained models, but they are typically trained on a single dataset and may not be suitable for comparison purposes (see the loading sketch below).
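A short sketch of pulling standard ImageNet-pretrained ResNet weights from two of the sources mentioned (torchvision and timm); the exact weight enums and model names depend on the installed versions.

```python
# Hypothetical sketch: load ImageNet-pretrained ResNet-50 from torchvision and timm.
import timm
import torch
import torchvision.models as models

resnet_tv = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2).eval()
resnet_timm = timm.create_model("resnet50", pretrained=True).eval()

with torch.no_grad():
    x = torch.randn(1, 3, 224, 224)
    print(resnet_tv(x).shape, resnet_timm(x).shape)   # torch.Size([1, 1000]) each
```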
Reddit | 26 Jan, 2025
AI Bootcamp Rollercoaster: My 5-Day Whirlwind with Google and Kaggle’s Gen AI Course
- The author shares their experience in Google and Kaggle's 5-day Generative AI Intensive course.
- The first day involved reading hefty whitepapers, working through interactive labs, and following a podcast and a livestream.
- The author found the whitepapers overwhelming and turned to the podcast for guidance.
- Despite being able to pause and rewind, note-taking became challenging due to the amount of information.
Medium | 29 Nov, 2024

Compare Kaggle with: Cognizant (3.7), Capgemini (3.7), HDFC Bank (3.9), Infosys (3.6), ICICI Bank (4.0), HCLTech (3.5), Tech Mahindra (3.5), Genpact (3.8), Teleperformance (3.9), Concentrix Corporation (3.7), Axis Bank (3.7), Amazon (4.0), Jio (4.0), iEnergizer (4.6), Reliance Retail (3.9), IBM (4.0), LTIMindtree (3.7), HDB Financial Services (3.9), Larsen & Toubro Limited (3.9), Deloitte (3.8)
Companies Similar to Kaggle
- Infosys (Consulting, IT Services & Consulting): 3.6 • 40.8k reviews
- ICICI Bank (Financial Services, Banking): 4.0 • 39.2k reviews
- HCLTech (Telecom, Education & Training, Hardware & Networking, Banking, Emerging Technologies, IT Services & Consulting, Software Product): 3.5 • 37.5k reviews
- Tech Mahindra (BPO/KPO, Consulting, Analytics & KPO, Engineering & Construction, IT Services & Consulting): 3.5 • 36.3k reviews
- Genpact (Financial Services, EdTech, IT Services & Consulting): 3.8 • 32.7k reviews
- Teleperformance (BPO, IT Services & Consulting, Software Product): 3.9 • 30.7k reviews