London Stock Exchange Group
Group discussion on random topics
25 objective questions
Essay writing to check language competency
I was interviewed in Sep 2024.
Implementing a Kafka-like queue involves creating a distributed messaging system for handling large volumes of data.
Use Apache Kafka or another messaging system as a base for understanding the architecture.
Design a system with topics, partitions, producers, and consumers.
Implement fault tolerance and scalability features like replication and partitioning.
Ensure high throughput and low latency for message processing.
Con...
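The design points above (topics, partitions, producers, consumers, per-key ordering) can be sketched as a toy in-memory log. This is a minimal illustration, not Kafka's actual API; all class and method names here are hypothetical, and real Kafka adds replication, persistence, and broker coordination on top of this idea.

```python
from collections import defaultdict

class MiniQueue:
    """Toy, in-memory sketch of a Kafka-like log: each topic is split
    into partitions, and each partition is an append-only message log."""

    def __init__(self, partitions_per_topic=2):
        self.partitions_per_topic = partitions_per_topic
        # topic -> list of partitions, each an append-only list of messages
        self.topics = defaultdict(
            lambda: [[] for _ in range(partitions_per_topic)]
        )
        # (topic, partition, consumer_group) -> next offset to read
        self.offsets = defaultdict(int)

    def produce(self, topic, key, value):
        # Route by key hash so the same key always lands in the same
        # partition, which preserves per-key ordering.
        part = hash(key) % self.partitions_per_topic
        log = self.topics[topic][part]
        log.append((key, value))
        return part, len(log) - 1  # partition and offset of the message

    def consume(self, topic, partition, group):
        # Each consumer group tracks its own offset per partition,
        # so independent groups can re-read the same log.
        cursor = (topic, partition, group)
        log = self.topics[topic][partition]
        if self.offsets[cursor] >= len(log):
            return None  # no new messages for this group
        msg = log[self.offsets[cursor]]
        self.offsets[cursor] += 1
        return msg
```

Scaling this design means spreading partitions across brokers and replicating each partition's log for fault tolerance, which is what gives Kafka its throughput and durability.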
I applied via Approached by Company and was interviewed in Apr 2024. There was 1 interview round.
I applied via Job Portal and was interviewed in Mar 2024. There were 3 interview rounds.
Financial marketing techniques involve strategies and tactics used to promote financial products and services.
Targeted advertising to reach specific demographics
Content marketing through blogs, articles, and social media
Email marketing campaigns to engage and inform customers
Search engine optimization (SEO) to improve online visibility
Influencer marketing to leverage the influence of industry experts
Direct mail campaig...
I applied via campus placement at University Vishveshvaraya College of Engineering (UVCE) and was interviewed in Aug 2024. There were 3 interview rounds.
30 minutes, 30 questions
3 questions of easy to medium difficulty
It went on for about 45 minutes
I applied via Naukri.com and was interviewed in Oct 2023. There were 4 interview rounds.
I applied via Referral and was interviewed in Oct 2023. There were 2 interview rounds.
Data can be loaded from JSON using Snowflake's COPY INTO command.
Use the COPY INTO command in Snowflake to load data from JSON files.
Specify the file format as JSON in the COPY INTO command.
Map the JSON attributes to the columns in the target table.
Example: COPY INTO target_table FROM 's3://bucket_name/file.json' FILE_FORMAT = (TYPE = 'JSON');
Performance tuning options in Snowflake include clustering, materialized views, query profiling, and resource monitoring.
Use clustering keys to organize data for faster query performance
Create materialized views to pre-aggregate data and improve query speed
Utilize query profiling to identify and optimize slow queries
Monitor resource usage to ensure efficient query execution
Snowpipe is configured in a Snowflake account by specifying the source data location (a stage) and the target table.
Configure a stage in Snowflake to specify the source data location.
Create a pipe in Snowflake to define the target table and the stage.
Set up notifications for the pipe to trigger loading data automatically.
Monitor the pipe for any errors or issues in data loading.
Example: CREATE STAGE my_stage URL = 's3://my_buck...
Various data modelling techniques like dimensional modelling, ER modelling, and data vault are used.
Dimensional modelling is used for data warehousing and involves organizing data into facts and dimensions.
ER modelling is used to visualize the data relationships in an entity-relationship diagram.
Data vault modelling is used for agile data warehousing and involves creating a flexible and scalable data model.
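Dimensional modelling, as described above, separates measures (facts) from descriptive attributes (dimensions). The sketch below illustrates the star-schema shape with plain Python dictionaries; all table and column names are hypothetical, and in Snowflake these would be real tables joined with SQL.

```python
# Hypothetical dimension tables: descriptive attributes keyed by surrogate key.
dim_instrument = {
    1: {"symbol": "VOD", "exchange": "LSE"},
    2: {"symbol": "BARC", "exchange": "LSE"},
}
dim_date = {
    20240101: {"year": 2024, "quarter": "Q1"},
}

# Hypothetical fact table: one row per trade, referencing dimensions by key.
fact_trades = [
    {"instrument_key": 1, "date_key": 20240101, "quantity": 100, "price": 70.5},
    {"instrument_key": 2, "date_key": 20240101, "quantity": 50, "price": 150.0},
]

def notional_by_symbol(facts, instruments):
    """Aggregate a measure from the fact table grouped by a dimension
    attribute -- the typical star-schema query shape."""
    totals = {}
    for row in facts:
        symbol = instruments[row["instrument_key"]]["symbol"]
        totals[symbol] = totals.get(symbol, 0.0) + row["quantity"] * row["price"]
    return totals
```

The same query in a warehouse would be a fact-to-dimension join with GROUP BY; the point of the star shape is that such joins stay shallow and predictable.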
Streams in Snowflake record changes made to a table so those changes can be replicated to another destination in near real-time.
Streams capture changes made to a table, such as inserts, updates, and deletes.
They can be used to track changes and replicate data to other tables or external systems.
Streams are created on a specific table and can be monitored for changes using SQL commands.
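The change-capture behaviour described above can be modelled with a toy table that records every DML operation in a pending-changes buffer. This is only a conceptual sketch with hypothetical names, not Snowflake's implementation; a real stream tracks offsets against the table's versioned micro-partitions rather than buffering rows.

```python
class TrackedTable:
    """Toy model of a table with a stream: every DML operation is also
    recorded as a change row, the way a stream exposes inserts,
    updates, and deletes that occurred since the last read."""

    def __init__(self):
        self.rows = {}      # primary key -> current row value
        self._changes = []  # pending change records

    def insert(self, key, value):
        self.rows[key] = value
        self._changes.append(("INSERT", key, value))

    def update(self, key, value):
        old = self.rows[key]
        self.rows[key] = value
        self._changes.append(("UPDATE", key, old, value))

    def delete(self, key):
        old = self.rows.pop(key)
        self._changes.append(("DELETE", key, old))

    def read_stream(self):
        # Consuming the stream returns the pending changes and advances
        # the offset past them (modelled here by clearing the buffer).
        changes, self._changes = self._changes, []
        return changes
```

A downstream consumer would read the stream periodically and apply the change records to another table or system, which is the basis of a CDC pipeline.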
I applied via Referral and was interviewed in Oct 2023. There were 2 interview rounds.
Streams in Snowflake are continuous flows of data that can be consumed in real-time for processing and analysis.
Streams capture changes made to a table and make them available for processing in real-time.
They can be used to implement CDC (Change Data Capture) solutions.
Streams can be created using the CREATE STREAM statement.
Example: CREATE STREAM my_stream ON TABLE my_table;
Data migration from on-premises to cloud data warehouse involves transferring data from local servers to a cloud-based storage solution.
Assess current data sources and structures on-premises
Select appropriate cloud data warehouse solution (e.g. Snowflake)
Plan data migration strategy including data extraction, transformation, and loading (ETL)
Test data migration process thoroughly before full implementation
Monitor and o
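The extraction, transformation, and loading steps listed above can be sketched as three small functions. This is a minimal illustration with hypothetical column names and an in-memory "warehouse"; a real migration would read from the on-premises database and load into Snowflake via stages and COPY INTO.

```python
def extract(source_rows):
    """Extract: read raw rows from the on-premises source (an
    in-memory list here stands in for a database query result)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalise column names and types to match the
    target warehouse schema."""
    out = []
    for r in rows:
        out.append({
            "customer_id": int(r["CUST_ID"]),
            "name": r["CUST_NAME"].strip().title(),
        })
    return out

def load(rows, target):
    """Load: upsert the cleaned rows into the target table,
    keyed by customer_id so re-runs are idempotent."""
    for r in rows:
        target[r["customer_id"]] = r
    return target

# One end-to-end run of the pipeline over a sample source row.
source = [{"CUST_ID": "7", "CUST_NAME": "  alice SMITH "}]
warehouse_table = load(transform(extract(source)), {})
```

Keeping the three stages separate makes each one independently testable, which is what "test the migration process thoroughly" amounts to in practice.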
Clustering keys in Snowflake help improve query performance by organizing data in a specific order.
Clustering keys determine the physical order of data in Snowflake tables.
They are defined at the table level and can be set during table creation or altered later.
Clustering keys can be single or composite, and should be chosen based on the most commonly used columns in queries.
They help reduce the amount of data scanned ...
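The pruning effect of clustering keys can be illustrated with a small simulation: sort rows by the clustering key, cut them into partitions with min/max metadata, and skip partitions that cannot contain the queried range. This is a conceptual sketch only; Snowflake manages micro-partitions and their metadata automatically.

```python
def build_partitions(rows, key, partition_size):
    """Sort rows by the clustering key and cut them into fixed-size
    partitions, recording min/max key metadata per partition
    (roughly what clustering does for micro-partitions)."""
    rows = sorted(rows, key=lambda r: r[key])
    parts = []
    for i in range(0, len(rows), partition_size):
        chunk = rows[i:i + partition_size]
        parts.append({"min": chunk[0][key], "max": chunk[-1][key], "rows": chunk})
    return parts

def scan(parts, key, lo, hi):
    """Range query that skips any partition whose min/max metadata
    cannot overlap [lo, hi]; returns matches and partitions scanned."""
    hits, scanned = [], 0
    for p in parts:
        if p["max"] < lo or p["min"] > hi:
            continue  # pruned: metadata alone rules this partition out
        scanned += 1
        hits.extend(r for r in p["rows"] if lo <= r[key] <= hi)
    return hits, scanned
```

With well-clustered data a narrow range query touches only a few partitions, which is exactly the reduction in scanned data the answer above describes.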
I applied via campus placement at Amrita School of Engineering, Bangalore and was interviewed in Jul 2023. There were 5 interview rounds.
The aptitude test was 30 minutes long with 30 questions, covering all domains: Java programming, C programming, verbal ability, aptitude, core computer science concepts, and so on.
The coding test had 3 questions to be solved in 60 minutes. The questions were of easy to medium difficulty, but the implementations were lengthy, so one needed fast typing skills to finish in time.
The group discussion ran for about 35 minutes, though it was supposed to be 30. There were 7 people on our team, and we were given a couple of sheets of paper with instructions on what to do and how to do it. At the end, we had a 5-minute presentation session, which was extended to 10 minutes. The topic was: "The company wants to introduce cryptocurrency in their trading system. What should be the business and technical planning strategies for the implementation?"
The duration of the London Stock Exchange Group interview process can vary, but it typically takes less than 2 weeks to complete.
based on 118 interviews
Interview experience
based on 864 reviews
Rating in categories
Content Analyst | 1.3k salaries | ₹2.5 L/yr - ₹7.2 L/yr
Associate Content Analyst | 894 salaries | ₹2.2 L/yr - ₹6 L/yr
Senior Content Analyst | 451 salaries | ₹4 L/yr - ₹9.5 L/yr
Research Analyst | 264 salaries | ₹2.8 L/yr - ₹7 L/yr
Associate Research Analyst | 226 salaries | ₹2.2 L/yr - ₹5.5 L/yr
National Stock Exchange of India
Bombay Stock Exchange
Multi Commodity Exchange of India
HDFC Bank