I applied via Approached by Company and was interviewed in Apr 2022. There were 2 interview rounds.
Open Graph is a protocol used to add metadata to web pages, allowing them to be shared on social media platforms.
Open Graph tags are added to the head section of an HTML page
They provide information about the page's title, description, image, and other attributes
Social media platforms like Facebook, Twitter, and LinkedIn use this metadata to display a preview of the shared link
Open Graph tags can improve the appearance of shared links
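The points above can be sketched as a minimal head section; the page title, description, URLs, and image path are placeholders, not from any real site:

```html
<!-- Hypothetical example: Open Graph tags in the head of a page -->
<head>
  <meta property="og:title" content="Example Article Title" />
  <meta property="og:description" content="A short summary shown in link previews." />
  <meta property="og:image" content="https://example.com/preview.jpg" />
  <meta property="og:url" content="https://example.com/article" />
  <meta property="og:type" content="article" />
</head>
```

Platforms like Facebook and LinkedIn read these properties when the URL is shared and build the preview card from them.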
Google Search Console is for website optimization while Google Analytics is for website traffic analysis.
Search Console focuses on technical SEO aspects like crawl errors, sitemaps, and indexing.
Analytics provides insights on user behavior, traffic sources, and conversions.
Search Console helps identify and fix website issues that affect search engine rankings.
Analytics helps track website performance and measure the effectiveness of marketing campaigns.
Canonical URL is the preferred URL of a webpage that search engines should index and display in search results.
Canonical URL helps to avoid duplicate content issues.
It is specified in the HTML code of a webpage using the rel=canonical tag.
Canonical URL is useful when multiple URLs have the same or similar content.
It consolidates the ranking signals for the duplicate or similar content into a single URL.
For example, htt...
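The idea can be shown with a minimal snippet; the URL is a placeholder for illustration only:

```html
<!-- Hypothetical example: every variant of the page declares one canonical URL -->
<link rel="canonical" href="https://example.com/shoes" />
```

Placing this tag in the head of each duplicate or near-duplicate page tells search engines which single URL should receive the ranking signals.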
A sitemap is a file that lists all the pages of a website to help search engines crawl and index them.
Sitemaps help search engines understand the structure of a website and its content.
They can include information such as the last modified date of a page and its priority.
Sitemaps can be submitted to search engines through their webmaster tools.
XML sitemaps are the most common type of sitemap used for SEO purposes.
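A minimal XML sitemap illustrating the fields mentioned above; the URL and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-04-01</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each url entry lists one page, with optional hints such as the last modified date and a relative priority.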
robots.txt is a file used to instruct search engine crawlers which pages or sections of a website to crawl or avoid.
It is a plain text file located in the root directory of a website.
It uses a specific syntax to specify user agents and disallowed pages or directories.
It can also include a Sitemap directive pointing to the location of the site's sitemap.
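A minimal robots.txt showing the syntax described above; the paths and sitemap URL are placeholders:

```text
# Hypothetical robots.txt placed at the site root
User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://example.com/sitemap.xml
```

The User-agent line selects which crawlers the rules apply to, and Disallow/Allow mark which paths they should skip or may crawl.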
I applied via Referral and was interviewed in Oct 2024. There were 2 interview rounds.
I applied via Job Portal and was interviewed in Dec 2024. There were 2 interview rounds.
I applied via Recruitment Consultant
I have 5 years of experience in this job.
5 years of experience in a similar role
Managed a team of 10 employees
Implemented new strategies to improve team performance
Implementing a complex algorithm for optimizing search results in a large-scale e-commerce platform.
Developed a custom search algorithm to improve search relevance and speed
Implemented data structures like trie and inverted index for efficient search
Optimized the algorithm to handle millions of products and user queries
Used techniques like caching and parallel processing to enhance performance
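The answer above mentions an inverted index for efficient search; a minimal sketch of that idea follows. The product names and the whitespace tokenizer are illustrative assumptions, not details from the actual platform:

```python
from collections import defaultdict

# Hypothetical product catalogue; real data would come from the platform.
products = {
    1: "red running shoes",
    2: "blue running jacket",
    3: "red leather wallet",
}

# Build the inverted index: token -> set of product ids containing it.
index = defaultdict(set)
for pid, name in products.items():
    for token in name.split():
        index[token].add(pid)

def search(query):
    """Return ids of products whose name contains every query token."""
    tokens = query.lower().split()
    if not tokens:
        return set()
    results = index[tokens[0]].copy()
    for token in tokens[1:]:
        results &= index[token]  # intersect postings lists
    return results

print(sorted(search("red running")))  # -> [1]
```

Looking up each token's postings set and intersecting them avoids scanning every product per query, which is what makes the structure scale to millions of items.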
I applied via Referral and was interviewed in Aug 2024. There were 3 interview rounds.
Experienced team lead with strong communication and problem-solving skills.
Over 5 years of experience leading teams in various industries
Skilled in conflict resolution and team motivation
Excellent communication and interpersonal skills
Proven track record of meeting and exceeding team goals
Strong problem-solving abilities, able to think quickly on my feet
General behavioral test
I have 3 years of Coupa support experience, including troubleshooting, training, and system configuration.
Provided technical support to users experiencing issues with Coupa platform
Conducted training sessions for new users on how to navigate and utilize Coupa features
Assisted with system configuration and customization based on user requirements