Neudesic Technologies
10+ Aakash Exploration Services Interview Questions and Answers
Q1. How do you create forms in React?
Forms in React can be created using controlled components, state management, and event handling.
Use controlled components to manage form data through state
Handle form submission using event handlers like onSubmit
Utilize form elements like input, textarea, and select to collect user input
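The controlled-component idea above can be sketched without a full React setup. The snippet below is a framework-free illustration: the small store stands in for React's useState hook so the example is self-contained, and the field names are illustrative.

```typescript
// Minimal sketch of a controlled form: form state lives in one place,
// change handlers update it, and submit reads from it. The store below
// is a stand-in for React's useState so the snippet runs on its own.
type FormState = { name: string; email: string };

function createFormStore(initial: FormState) {
  let state = initial;
  return {
    get: () => state,
    // Mirrors setState({ ...state, [field]: value }) in an onChange handler
    setField: (field: keyof FormState, value: string) => {
      state = { ...state, [field]: value };
    },
  };
}

// onSubmit handler: in React this would be (e) => { e.preventDefault(); ... }
function handleSubmit(store: ReturnType<typeof createFormStore>): string {
  return JSON.stringify(store.get());
}

const store = createFormStore({ name: "", email: "" });
store.setField("name", "Ada");          // user types into the name input
store.setField("email", "ada@example.com");
console.log(handleSubmit(store));
```

In a real React component, each input's `value` comes from state and its `onChange` calls the setter, which is exactly the update `setField` performs here.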
Q2. How will you deploy an Azure Function created in Visual Studio to Azure?
Deploying an Azure Function created in Visual Studio to Azure.
Create a new function app in the Azure portal (or let the Publish wizard create one).
Configure the function app settings and connection strings.
Publish the function code from Visual Studio to the function app.
Test the deployed function in the Azure portal.
Q3. What are the design patterns in C#?
Design patterns in C# are reusable solutions to common problems in software design.
Some common design patterns in C# include Singleton, Factory, Observer, and Strategy.
Singleton pattern ensures a class has only one instance and provides a global point of access to it.
Factory pattern creates objects without specifying the exact class of object that will be created.
Observer pattern defines a one-to-many dependency between objects so that when one object changes state, all its dependents are notified and updated automatically.
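The Singleton pattern described above can be shown in a few lines. The sketch below is in TypeScript for illustration; the C# version is analogous, using a private constructor and a static Instance property. The class name and config value are hypothetical.

```typescript
// Singleton: a private constructor prevents outside instantiation, and a
// static accessor lazily creates and returns the one shared instance.
class AppConfig {
  private static instance: AppConfig | null = null;
  private constructor(public readonly env: string) {}

  static getInstance(): AppConfig {
    // Create the single instance on first access only
    if (AppConfig.instance === null) {
      AppConfig.instance = new AppConfig("production");
    }
    return AppConfig.instance;
  }
}

const a = AppConfig.getInstance();
const b = AppConfig.getInstance();
console.log(a === b); // both calls return the same instance: true
```

Note that in a multithreaded C# program this lazy check would need locking (or `Lazy<T>`); the single-threaded sketch omits that.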
Q4. What are the various fields available in Azure Functions deployment using Visual Studio?
Azure functions deployment fields in Visual Studio
Function name
Trigger type
Input and output bindings
Environment variables
Application settings
Connection strings
Q5. How do you copy multiple CSV files into a single file?
Use a command line tool like cat to concatenate multiple CSV files into a single file
Navigate to the directory where the CSV files are located
Run the command 'cat file1.csv file2.csv > combined.csv' to merge file1.csv and file2.csv into a new file named combined.csv. Note that cat keeps every file's header row, so if the CSVs have headers, strip them from all but the first file.
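A small script handles the header problem that plain cat does not. The sketch below uses Node's fs module; the file names match the example above and are illustrative.

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Merge CSV files, keeping only the first file's header row --
// something a plain `cat` of the files would duplicate.
function mergeCsv(paths: string[]): string {
  const out: string[] = [];
  paths.forEach((path, i) => {
    const lines = readFileSync(path, "utf8").trim().split("\n");
    // Keep the header only from the first file
    out.push(...(i === 0 ? lines : lines.slice(1)));
  });
  return out.join("\n") + "\n";
}

// Create two sample files, then merge them
writeFileSync("file1.csv", "id,name\n1,alpha\n");
writeFileSync("file2.csv", "id,name\n2,beta\n");
writeFileSync("combined.csv", mergeCsv(["file1.csv", "file2.csv"]));
console.log(readFileSync("combined.csv", "utf8"));
```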
Q6. How do you load 1 million records in less time? (ADF pipeline optimisations)
Optimize ADF pipeline to load 1 million records faster
Use parallel execution to load data in chunks
Leverage partitioning to distribute data processing
Optimize data flow by eliminating unnecessary transformations
Use efficient data sources and destinations
Consider using Azure Data Factory Data Flow for better performance
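The "parallel execution in chunks" idea above can be sketched conceptually. This is not ADF code: `loadPartition` is a hypothetical stand-in for a copy activity writing one partition to the sink, and the chunk size is illustrative.

```typescript
// Conceptual sketch: split the records into partitions and load the
// partitions concurrently, like parallel copy / partitioning in ADF.
async function loadPartition(partition: number[]): Promise<number> {
  // Stand-in for a copy activity writing one partition to the sink
  return partition.length;
}

async function loadAll(records: number[], chunkSize: number): Promise<number> {
  const partitions: number[][] = [];
  for (let i = 0; i < records.length; i += chunkSize) {
    partitions.push(records.slice(i, i + chunkSize));
  }
  // All partitions are loaded concurrently rather than one after another
  const counts = await Promise.all(partitions.map(loadPartition));
  return counts.reduce((a, b) => a + b, 0);
}

loadAll(Array.from({ length: 1000 }, (_, i) => i), 100)
  .then((n) => console.log(`loaded ${n} records across 10 parallel partitions`));
```

The win comes from the partitions running in parallel against a sink that can absorb concurrent writes; in ADF the equivalent knobs are the copy activity's degree of parallelism and source partitioning options.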
Q7. What is DevOps according to you?
DevOps is a software development approach that emphasizes collaboration, communication, and automation.
DevOps is a combination of development and operations teams working together to deliver software quickly and reliably.
It involves automating the software delivery process to reduce errors and increase efficiency.
DevOps also emphasizes continuous feedback and improvement to ensure that software meets the needs of users and the business.
Tools commonly used in DevOps include version control systems (e.g., Git), CI/CD servers (e.g., Jenkins, Azure DevOps), and configuration management tools.
Q8. What is HDInsight in Azure Data Factory?
HDInsight is a cloud-based service in Azure that makes it easy to process big data using Apache Hadoop, Spark, and other tools.
It is a fully managed service supporting open-source frameworks such as Hadoop, Spark, Hive, and Kafka.
It allows you to create, scale, and monitor Hadoop clusters in Azure.
HDInsight integrates with Azure Data Factory to provide data orchestration and movement capabilities for big data workflows.
You can use HDInsight clusters as compute targets for ADF activities such as Hive, Pig, and Spark jobs.
Q9. How do you perform a data copy in Azure?
Data copy in Azure can be performed using Azure Data Factory or Azure Storage Explorer.
Use Azure Data Factory to create data pipelines for copying data between various sources and destinations.
Use Azure Storage Explorer to manually copy data between Azure storage accounts.
Utilize Azure Blob Storage for storing the data to be copied.
Q10. What are the different types of Cloud?
There are three main cloud deployment models: public, private, and hybrid.
Public Cloud: Services provided by third-party providers over the internet.
Private Cloud: Services provided by a single organization for internal use.
Hybrid Cloud: Combination of public and private cloud services.
Examples: AWS, Microsoft Azure, Google Cloud Platform.
Public cloud examples: Dropbox, Gmail, Salesforce.
Private cloud examples: Bank of America, NASA, US Army.
Hybrid cloud examples: Netflix, Cisco, IBM.
Q11. Dynamic file ingestion in ADF
Dynamic file ingestion in ADF involves using parameters to dynamically load files into Azure Data Factory.
Use parameters to specify the file path and name dynamically
Utilize expressions to dynamically generate file paths
Implement dynamic mapping data flows to handle different file structures
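The dynamic-path idea can be made concrete. In ADF the dataset's file path would be an expression built from pipeline parameters, for example `@concat('input/', pipeline().parameters.folder, '/', formatDateTime(utcNow(), 'yyyy-MM-dd'), '.csv')`. The sketch below mirrors that expression in plain code; the folder layout and names are illustrative.

```typescript
// Mirrors an ADF dynamic-content expression: the file path is computed
// from a parameter and the run date instead of being hard-coded.
function buildFilePath(folder: string, runDate: Date): string {
  const day = runDate.toISOString().slice(0, 10); // yyyy-MM-dd
  return `input/${folder}/${day}.csv`;
}

console.log(buildFilePath("sales", new Date("2024-03-15T09:30:00Z")));
// → input/sales/2024-03-15.csv
```

In the pipeline itself, the same value would flow in as a dataset parameter so one dataset definition can serve every daily file.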
Q12. Aggregation in Databricks
Aggregation in Databricks refers to the process of combining and summarizing data in a Spark environment.
Aggregation functions like sum, count, avg, min, max can be used in Databricks to summarize data.
Grouping data based on certain columns and then applying aggregation functions is a common practice.
Aggregation can be performed on large datasets efficiently using Databricks' distributed computing capabilities.
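The group-then-aggregate step described above looks like `df.groupBy("dept").agg(sum("salary"))` in Spark. The snippet below shows the same logic locally in plain TypeScript; column names and values are illustrative, and of course Spark would execute this distributed across the cluster.

```typescript
// Plain-TypeScript equivalent of groupBy("dept") + sum("salary"):
// bucket rows by key, then fold an aggregate per bucket.
type Row = { dept: string; salary: number };

function sumByDept(rows: Row[]): Record<string, number> {
  return rows.reduce<Record<string, number>>((acc, row) => {
    acc[row.dept] = (acc[row.dept] ?? 0) + row.salary;
    return acc;
  }, {});
}

const rows: Row[] = [
  { dept: "eng", salary: 100 },
  { dept: "eng", salary: 150 },
  { dept: "hr", salary: 90 },
];
console.log(sumByDept(rows)); // → { eng: 250, hr: 90 }
```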
Q13. Synthetic events in React
Synthetic events in React are cross-browser wrappers that React creates around the browser's native events.
Synthetic events are created using the SyntheticEvent object provided by React.
Developers can use synthetic events to handle user interactions like clicks, key presses, etc.
Synthetic events have the same interface as native browser events but are normalized across different browsers.
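The normalization idea can be illustrated with a small sketch. This is conceptually what React's SyntheticEvent layer does, not React's actual implementation; the class and fields are illustrative.

```typescript
// Conceptual sketch of event normalization: wrap a native event so every
// handler sees the same interface regardless of browser quirks.
type NativeEventLike = { type: string; target: unknown };

class SyntheticEventSketch {
  private prevented = false;
  constructor(public readonly nativeEvent: NativeEventLike) {}
  get type() { return this.nativeEvent.type; }
  preventDefault() { this.prevented = true; } // same call on every browser
  isDefaultPrevented() { return this.prevented; }
}

const e = new SyntheticEventSketch({ type: "click", target: null });
e.preventDefault();
console.log(e.type, e.isDefaultPrevented()); // → click true
```

Handlers written against the wrapper (`e.preventDefault()`, `e.type`) stay portable, while the raw event remains reachable via `nativeEvent` when needed.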
Q14. What are linked services? What is a dataset? Function vs. stored procedure (SP)?
Linked services are connections to external data sources in Azure Data Factory. Data sets are representations of data in those sources. Functions and stored procedures are used for data transformation.
Linked services are connections to external data sources such as databases, file systems, or APIs.
Data sets are representations of data in those sources, specifying the location, format, and schema of the data.
Functions are reusable code snippets used for data transformation and typically return a value, while stored procedures are precompiled statements stored in the database.
Q15. Agile process followed in last project.
Agile methodology was followed in the last project.
The project was divided into sprints of 2 weeks each.
Daily stand-up meetings were conducted to discuss progress and roadblocks.
Backlog grooming sessions were held to prioritize tasks.
Retrospective meetings were conducted at the end of each sprint to identify areas of improvement.
The team used JIRA for project management and tracking progress.
Q16. Prepare user story for patient registration system
User story for patient registration system in a healthcare setting
As a new patient, I want to be able to easily register online before my appointment.
The system should collect basic information such as name, contact details, and insurance information.
Patients should be able to schedule appointments and update their information as needed.
The system should send confirmation emails and reminders for appointments.
Staff should have access to patient records for check-in and follow-up.
Q17. Current project architecture
Our current project architecture follows a microservices approach.
We have divided our application into smaller, independent services.
Each service has its own database and communicates with other services through APIs.
We use Docker and Kubernetes for containerization and orchestration.
We also have a centralized configuration server for managing configurations.
We follow RESTful API design principles for communication between services.