Is It Mandatory to Start Hadoop to Run Spark Application | Hadoop Interview Questions and Answers

Published: 03 December 2018
on channel: ACADGILD

https://acadgild.com/big-data/big-dat...
Welcome back to Apache Spark interview questions and answers, powered by Acadgild. In this video, Mr. Sudhanshu, a Data Scientist, explains Hadoop interview questions and answers specifically on Apache Spark. If you missed the master video of interview questions and answers, kindly click the following link.
Top 20 Apache Spark Interview Questions -    • Top 20 Apache Spark Interview Questio...  
In this video the mentor explains how Apache Spark stores data. So, here is the interview question:
Is It Mandatory to Start Hadoop to Run Spark Application?
No! Spark supports a local mode as well as a cluster mode. We can configure and run a Spark program on a Windows machine, on a single Linux or Mac machine, or on top of a cluster of many machines. In terms of storage, however, Spark has no storage layer of its own.
We can load data from the local file system and process it; Hadoop or HDFS is not mandatory to run a Spark application.
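To make this concrete, here is a minimal sketch of a Spark application that runs entirely in local mode, with no Hadoop services started. The app name and the file path (file:///tmp/sample.txt) are placeholders; "local[*]" simply tells Spark to run inside the current JVM using all available cores.

import org.apache.spark.sql.SparkSession

object LocalSparkDemo {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark inside this JVM on all available cores;
    // no Hadoop daemons, YARN, or HDFS need to be running.
    val spark = SparkSession.builder()
      .appName("LocalSparkDemo")
      .master("local[*]")
      .getOrCreate()

    // Read from the local file system ("file://"), not HDFS.
    // The path below is a placeholder; point it at any text file.
    val lines = spark.read.textFile("file:///tmp/sample.txt")
    println(s"Line count: ${lines.count()}")

    spark.stop()
  }
}

Running this (for example with sbt run or from an IDE) only needs a Spark dependency on the classpath, which illustrates the answer above: Spark works fine without starting Hadoop.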
Thank you for watching the video. Please like, comment, and subscribe to the channel for more videos.
For more updates on courses and tips follow us on:
Facebook: facebook.com/acadgild
Twitter: twitter.com/acadgild
LinkedIn: linkedin.com/acadgild

