There are four cores and four worker nodes in Spark. How many jobs will run in parallel?
Assuming four cores in total across the workers, only one job will run at a time by default, but up to four tasks will run in parallel within that job.
In Spark, each core executes one task at a time, so four cores allow four tasks to run concurrently.
Since the driver's default FIFO scheduler submits jobs sequentially from a single thread, jobs themselves do not overlap unless they are submitted from separate threads; the number of cores determines only how many tasks inside the active job run in parallel.
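A minimal PySpark sketch of this behavior, assuming PySpark is installed locally: `local[4]` simulates four cores, so a single action triggers one job whose eight tasks execute in two waves of four concurrent tasks.

```python
from pyspark.sql import SparkSession

# local[4] gives Spark four cores to schedule tasks on
spark = (
    SparkSession.builder
    .master("local[4]")
    .appName("parallelism-demo")
    .getOrCreate()
)
sc = spark.sparkContext

# defaultParallelism reflects the available cores: 4 here
print(sc.defaultParallelism)

# collect() triggers ONE job; its 8 tasks run as two waves of 4,
# because each core executes a single task at a time
rdd = sc.parallelize(range(8), numSlices=8)
print(rdd.map(lambda x: x * x).collect())

spark.stop()
```

Running a second action after the first would start a second job only once the first finishes, which is why the answer is one job (with four parallel tasks), not four parallel jobs.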