London Stock Exchange Group
I appeared for an interview in Jan 2025.
I was assigned a task to create a questionnaire consisting of five questions. I had to code it, build it, and provide instructions on how to use it.
I want to join LSEG because of its reputation for innovation and growth in the financial industry.
LSEG is a leading global financial markets infrastructure provider
Opportunity to work with cutting-edge technology and contribute to innovative projects
Desire to be part of a dynamic and collaborative team environment
Excited about the potential for career growth and development at LSEG
I can bring strong leadership skills, technical expertise, and a collaborative approach to the team.
Strong leadership skills to guide and motivate team members
Technical expertise to provide guidance and support on development tasks
Collaborative approach to foster teamwork and communication within the team
I applied via Approached by Company and was interviewed in Oct 2024. There were 2 interview rounds.
I appeared for an interview before Apr 2024, where I was asked the following questions.
I applied via Naukri.com and was interviewed in Apr 2024. There were 3 interview rounds.
I have 5 years of experience working as a software engineer in various tech companies.
Worked on developing web applications using Java, Spring, and Angular
Experience with database management systems like MySQL and MongoDB
Collaborated with cross-functional teams to deliver projects on time and within budget
I applied via Approached by Company and was interviewed in Dec 2024. There was 1 interview round.
I applied via LinkedIn and was interviewed in Feb 2024. There were 5 interview rounds.
I was given a problem and asked to write tests and automate them.
I applied via Approached by Company and was interviewed in Apr 2024. There was 1 interview round.
I applied via Naukri.com and was interviewed in Oct 2023. There were 4 interview rounds.
I applied via Referral and was interviewed in Oct 2023. There were 2 interview rounds.
Data can be loaded from JSON using Snowflake's COPY INTO command.
Use the COPY INTO command in Snowflake to load data from JSON files.
Specify the file format as JSON in the COPY INTO command.
Map the JSON attributes to the columns in the target table.
Example: COPY INTO target_table FROM 's3://bucket_name/file.json' FILE_FORMAT = (TYPE = 'JSON');
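A fuller sketch of mapping JSON attributes to columns during the load (the file format, stage, table, and column names here are hypothetical, and the stage would need credentials configured):
-- Hypothetical file format and external stage pointing at the JSON files
CREATE OR REPLACE FILE FORMAT json_format TYPE = 'JSON';
CREATE OR REPLACE STAGE json_stage URL = 's3://bucket_name/' FILE_FORMAT = json_format;
-- Map JSON attributes to target columns with a transformation during the COPY
COPY INTO target_table (id, name)
  FROM (SELECT $1:id::NUMBER, $1:name::STRING FROM @json_stage/file.json);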
Performance tuning options in Snowflake include clustering, materialized views, query profiling, and resource monitoring.
Use clustering keys to organize data for faster query performance
Create materialized views to pre-aggregate data and improve query speed
Utilize query profiling to identify and optimize slow queries
Monitor resource usage to ensure efficient query execution
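A minimal sketch of these options in Snowflake SQL (table, view, and column names are hypothetical):
-- Clustering key on commonly filtered columns
ALTER TABLE sales CLUSTER BY (region, sale_date);
-- Materialized view that pre-aggregates the data
CREATE MATERIALIZED VIEW sales_by_region AS
  SELECT region, SUM(amount) AS total_amount FROM sales GROUP BY region;
-- Surface the slowest recent queries for further profiling
SELECT query_id, query_text, total_elapsed_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY total_elapsed_time DESC
LIMIT 10;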
Snowpipe is configured within a Snowflake account by specifying the source data location and the target table.
Configure a stage in Snowflake to specify the source data location.
Create a pipe in Snowflake to define the target table and the stage.
Set up notifications for the pipe to trigger loading data automatically.
Monitor the pipe for any errors or issues in data loading.
Example: CREATE STAGE my_stage URL = 's3://my_buck...
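A minimal Snowpipe sketch along these lines (the bucket path, stage, pipe, and table names are hypothetical):
CREATE STAGE my_stage URL = 's3://my_bucket/data/' FILE_FORMAT = (TYPE = 'JSON');
-- AUTO_INGEST = TRUE lets cloud storage event notifications trigger the loads
CREATE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO target_table FROM @my_stage;
-- Check the pipe for errors or backlog
SELECT SYSTEM$PIPE_STATUS('my_pipe');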
Various data modelling techniques like dimensional modelling, ER modelling, and data vault are used.
Dimensional modelling is used for data warehousing and involves organizing data into facts and dimensions.
ER modelling is used to visualize the data relationships in an entity-relationship diagram.
Data vault modelling is used for agile data warehousing and involves creating a flexible and scalable data model.
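As an illustration of dimensional modelling, a tiny star-schema sketch (all table and column names are hypothetical):
-- Dimension table describing customers
CREATE TABLE dim_customer (
  customer_key NUMBER PRIMARY KEY,
  customer_name STRING,
  region STRING
);
-- Fact table holding the measures, keyed to the dimension
CREATE TABLE fact_sales (
  sale_id NUMBER,
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  sale_date DATE,
  amount NUMBER(12,2)
);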
Streams in Snowflake capture changes made to a table so that the data can be continuously replicated to another destination in near real time.
Streams capture changes made to a table, such as inserts, updates, and deletes.
They can be used to track changes and replicate data to other tables or external systems.
Streams are created on a specific table and can be monitored for changes using SQL commands.
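A small sketch of creating and inspecting a stream (the orders table name is hypothetical):
CREATE STREAM orders_stream ON TABLE orders;
-- The stream exposes changed rows plus METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID
SELECT * FROM orders_stream;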
I applied via Referral and was interviewed in Oct 2023. There were 2 interview rounds.
Streams in Snowflake expose a continuous record of changes to a table that can be consumed in near real time for processing and analysis.
Streams capture changes made to a table and make them available for processing in real-time.
They can be used to implement CDC (Change Data Capture) solutions.
Streams can be created using the CREATE STREAM statement.
Example: CREATE STREAM my_stream ON TABLE my_table;
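A sketch of consuming that stream for CDC (my_table_copy and the column names are hypothetical); reading the stream inside a DML statement advances its offset:
-- Apply newly inserted rows from the source table to a downstream copy
INSERT INTO my_table_copy (col1, col2)
  SELECT col1, col2 FROM my_stream WHERE METADATA$ACTION = 'INSERT';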
Data migration from on-premises to cloud data warehouse involves transferring data from local servers to a cloud-based storage solution.
Assess current data sources and structures on-premises
Select appropriate cloud data warehouse solution (e.g. Snowflake)
Plan data migration strategy including data extraction, transformation, and loading (ETL)
Test data migration process thoroughly before full implementation
Monitor and optimize performance after the migration
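A minimal sketch of the load step, assuming data has already been extracted from the on-premises system to local CSV files (all names and paths are hypothetical):
-- Internal stage to receive the extracted files
CREATE STAGE migration_stage;
-- Run from SnowSQL on the machine holding the extracts
PUT file:///tmp/extract/customers.csv @migration_stage;
-- Load the staged files into the target table
COPY INTO customers FROM @migration_stage FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);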
Clustering keys in Snowflake help improve query performance by organizing data in a specific order.
Clustering keys determine the physical order of data in Snowflake tables.
They are defined at the table level and can be set during table creation or altered later.
Clustering keys can be single or composite, and should be chosen based on the most commonly used columns in queries.
They help reduce the amount of data scanned during query execution.
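A short sketch, using a hypothetical events table:
-- Composite clustering key chosen from commonly filtered columns
CREATE TABLE events (
  event_date DATE,
  customer_id NUMBER,
  payload VARIANT
) CLUSTER BY (event_date, customer_id);
-- A key can also be added or changed later, and clustering health can be checked
ALTER TABLE events CLUSTER BY (event_date, customer_id);
SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(event_date, customer_id)');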
The duration of the London Stock Exchange Group interview process can vary, but it typically takes less than 2 weeks to complete (based on 31 interviews).
Designation | Salaries reported | Salary range
Content Analyst | 1.3k salaries | ₹2.5 L/yr - ₹7.5 L/yr
Associate Content Analyst | 899 salaries | ₹2.2 L/yr - ₹6 L/yr
Senior Content Analyst | 453 salaries | ₹4 L/yr - ₹9.5 L/yr
Research Analyst | 268 salaries | ₹2.8 L/yr - ₹7 L/yr
Associate Research Analyst | 233 salaries | ₹2.2 L/yr - ₹4.6 L/yr