Hadoop achieves reliability by replicating
the data across multiple hosts and hence does
not require ________ storage on hosts.
a) RAID
b) Standard RAID levels
c) ZFS
d) Operating system

Answer : RAID
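
HDFS gets its durability from block replication: each block is copied to several DataNodes (three by default), so the individual hosts do not need RAID for data protection. A minimal sketch of checking and changing the replication factor from the filesystem shell, assuming a running cluster and a hypothetical file /user/demo/data.txt:

    hdfs dfs -ls /user/demo/data.txt              # second column of the listing is the replication factor
    hdfs dfs -setrep -w 3 /user/demo/data.txt     # set it to 3 and wait for the copies to be made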

Related questions

Description : IBM and ________ have announced a major initiative to use Hadoop to support university courses in distributed computer programming. a) Google Latitude b) Android (operating system) c) Google Variations d) Google

Last Answer : Google

Description : Which of the following standards connects distributed hosts or tenants to their provisioned storage in the cloud? a) CDMI b) OCMI c) COA d) All of the mentioned

Last Answer : CDMI

Description : The Hadoop list includes the HBase database, the Apache Mahout ________ system, and matrix operations. a) Machine learning b) Pattern recognition c) Statistical classification d) Artificial intelligence

Last Answer : Machine learning

Description : What are two common concerns in a cloud environment? (Choose two.) A. Inability to use proxy servers and load balancers. B. Not enough computing capacity during peak usage times. C. Illegal access ... same physical ethernet, they should not be able to read or modify each other's network traffic.

Last Answer : Illegal access to private data stored by applications in the cloud. If two guest instances are running on the same host or on different hosts connected to the same physical Ethernet, they should not be able to read or modify each other's network traffic.

Description : Sun also has the Hadoop Live CD ________ project, which allows running a fully functional Hadoop cluster using a live CD. a) OpenOffice.org b) OpenSolaris c) GNU d) Linux

Last Answer : OpenSolaris

Description : ________ is a utility which allows users to create and run jobs with any executables as the mapper and/or the reducer. a) Hadoop Strdata b) Hadoop Streaming c) Hadoop Stream d) None of the mentioned

Last Answer : Hadoop Streaming
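
Hadoop Streaming is packaged as its own jar and turns any executable that reads stdin and writes stdout into a mapper or reducer. A minimal sketch of a streaming job, assuming the usual jar location under the Hadoop install and hypothetical /user/demo input and output paths:

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -input /user/demo/input \
        -output /user/demo/output \
        -mapper /bin/cat \
        -reducer /usr/bin/wc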

Description : As a solution to the challenge of supporting a large number of application service consumers from around the world, cloud infrastructure providers (i.e., IaaS providers) have established data ... a) Global exchange of cloud resources b) Resource provisioning c) Cloud security d) Resource sharing

Last Answer : Global exchange of cloud resources

Description : In HDFS, reliability of stored data is maintained by a) Block Operations b) Block Replication c) Block Storage d) All the above

Last Answer : Block Replication
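
Because every HDFS block is kept as multiple replicas, the NameNode can report on replication health for the whole namespace. A quick way to see this, assuming an operational cluster (the summary typically lists the default replication factor, the average block replication, and any under-replicated blocks):

    hdfs fsck /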

Description : Point out the wrong statement. a) Load balancing virtualizes systems and resources by mapping a logical address to a physical address b) Multiple instances of various Google applications are running on different hosts c) Google uses hardware virtualization d) All of the mentioned

Last Answer : Google uses hardware virtualization

Description : In computing, ________ improves the distribution of workloads across multiple computing resources, such as computers, a computer cluster, network links, central processing units, or disk drives. a) Virtual machine b) Virtual computing c) Virtual cloud d) Load balancer

Last Answer : Load balancer

Description : Point out the correct statement. a) Hadoop is an ideal environment for extracting and transforming small volumes of data b) Hadoop stores data in HDFS and supports data compression/decompression c) The ... less useful than a MapReduce job to solve graph and machine learning d) None of the mentioned

Last Answer : Hadoop stores data in HDFS and supports data compression/decompression

Description : Which one of the following statements is false regarding the Distributed Cache? a) The Hadoop framework will ensure that any files in the Distributed Cache are distributed to all map and reduce tasks. b) ... Cache on to the slave node before any tasks for the job are executed on that node.

Last Answer : MapReduce tries to place the data and the compute as close as possible
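
One common way to put a side file into the Distributed Cache without writing any Java is the generic -files option, which ships the listed local files to every task's working directory. A sketch using the streaming jar from the earlier example; lookup.txt and mapper.py are hypothetical local files:

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -files lookup.txt,mapper.py \
        -mapper mapper.py \
        -input /user/demo/input -output /user/demo/output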

Description : In a Hadoop cluster, what is true for an HDFS block that is no longer available due to disk corruption or machine failure? a) It is lost forever b) It can be replicated from its alternative locations ... to read it. d) The MapReduce job process runs ignoring the block and the data stored in it.

Last Answer : It can be replicated from its alternative locations to other live machines.

Description : The client reading data from the HDFS filesystem in Hadoop a) gets the data from the namenode b) gets the block location from the datanode c) gets only the block locations from the namenode d) gets both the data and block location from the namenode

Last Answer : gets only the block locations from the namenode
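
The namenode hands out only metadata (block IDs and the datanodes holding each replica); the client then streams the actual bytes directly from those datanodes. One way to see the block-to-datanode mapping the namenode reports, assuming /user/demo/data.txt is an existing file:

    hdfs fsck /user/demo/data.txt -files -blocks -locations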

Description : _________ is a software library used to write and run large user applications on vast data sets in business applications. a) Apache Tomcat b) Hadoop c) OpenStack d) OpenNebula

Last Answer : Hadoop

Description : __________ allows multiple operating system instances to run as guests on the same server. A. Server. B. Hypervisor. C. Network. D. Data.

Last Answer : Hypervisor.

Description : Which of the following genres does Hadoop produce? a) Distributed file system b) JAX-RS c) Java Message Service d) Relational Database Management System

Last Answer : Distributed file system

Description : Very large sustainable reading and writing bandwidth, mostly continuous accessing instead of random accessing. The programming interface is similar to that of the POSIX file system accessing interface. This is ... : Google File System c) HDFS: Hadoop Distributed File System d) None of the above

Last Answer : GFS: Google File System

Description : Which operating system supports the ZFS file system?

Last Answer : Sun's Solaris OS supports the ZFS file system natively.

Description : A ________ cloud requires virtualized storage to support the staging and storage of data. a) soft b) compute c) local d) none of the mentioned

Last Answer : compute

Description : Which of the following can be identified as cloud? a) Web Applications b) Intranet c) Hadoop d) All of the mentioned

Last Answer : Hadoop

Description : The Hadoop MapReduce framework spawns one map task for each __________ generated by the InputFormat for the job. a) OutputSplit b) InputSplit c) InputSplitStream d) All of the mentioned

Last Answer : InputSplit

Description : Which of the following platforms does Hadoop run on? a) Bare metal b) Debian c) Cross-platform d) Unix-like

Last Answer : Cross-platform

Description : What was Hadoop written in? a) Java (software platform) b) Perl c) Java (programming language) d) Lua (programming language)

Last Answer : Java (programming language)

Description : What license is Hadoop distributed under? a) Apache License 2.0 b) Mozilla Public License c) Shareware d) Commercial

Last Answer : Apache License 2.0

Description : Which of the following is built on top of a Hadoop framework using the Elastic Compute Cloud? a) Amazon Elastic MapReduce b) Amazon Mechanical Turk c) Amazon DevPay d) Multi-Factor Authentication

Last Answer : Amazon Elastic MapReduce

Description : During Safemode, the Hadoop cluster is in a) Read-only b) Write-only c) Read-Write d) None of the above

Last Answer : Read-only

Description : Which of the following commands is used to enter Safemode? a) hadoop dfsadmin -safemode get b) bin dfsadmin -safemode get c) hadoop dfsadmin -safemode enter d) None of the above

Last Answer : hadoop dfsadmin -safemode enter
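
Safemode is the read-only state the NameNode sits in at startup (and can be forced into for maintenance); while it is on, writes are rejected and no block replication or deletion takes place. A quick sketch of the related commands; on current releases hdfs dfsadmin is the preferred front end for the same operations:

    hdfs dfsadmin -safemode get      # report whether safemode is currently on
    hdfs dfsadmin -safemode enter    # force the NameNode into safemode
    hdfs dfsadmin -safemode leave    # return to normal read-write operation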

Description : What are the advantages of HDFS federation in Hadoop? a) Isolation b) Namespace scalability c) Improves throughput d) All of the above

Last Answer : All of the above

Description : Under Hadoop High Availability, Fencing means a) Preventing a previously active namenode from starting to run again. b) Preventing the start of a failover in the event of network failure with the ... previously active namenode. d) Preventing a previously active namenode from writing to the edit log.

Last Answer : Preventing a previously active namenode from writing to the edit log.

Description : The topmost layer of Hadoop is the _________ engine a) HDFS b) Cluster c) MapReduce d) Job Tracker

Last Answer : MapReduce

Description : In Hadoop, the files are stored in a) Directory b) DFS c) GFS d) HDFS

Last Answer : HDFS
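
Files reach HDFS through the filesystem shell. A minimal round trip, assuming a local file report.csv and a hypothetical /user/demo directory:

    hdfs dfs -mkdir -p /user/demo
    hdfs dfs -put report.csv /user/demo/
    hdfs dfs -ls /user/demo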

Description : Hadoop was originally developed by a) Microsoft b) Google c) Yahoo d) IBM

Last Answer : Yahoo

Description : A bit-mapped terminal A) supports a display containing multiple windows B) requires a considerable amount of video RAM C) requires a tremendous amount of copying and hence has low performance D) all of the above

Last Answer : Answer : D

Description : Point out the wrong statement. a) Large cloud providers with geographically dispersed sites worldwide, therefore, achieve reliability rates that are hard for private systems to achieve b) Private data ... c) A network backbone is a very low-capacity network connection d) None of the mentioned

Last Answer : A network backbone is a very low-capacity network connection

Description : Which of the following statements about Google App Engine (GAE) is INCORRECT. A. It's a Platform as a Service (PaaS) model. B. Automatic Scalability is built in with GAE. As ... s applications. So, applications can take advantage of reliability, performance and security of Google's infrastructure.

Last Answer : You can decide on how many physical servers required for hosting your application.

Description : Point out the correct statement. a) A virtual machine is a computer that is walled off from the physical computer that the virtual machine is running on b) Virtual machines provide the ... that having resources indirectly addressed means there is some level of overhead d) All of the mentioned

Last Answer : All of the mentioned

Description : ________ captive requires that the cloud accommodate multiple compliance regimes. a) Licensed b) Policy-based c) Variable d) All of the mentioned

Last Answer : Policy-based

Description : Cloud computing shifts capital expenditures into ________ expenditures. a) operating b) service c) local d) none of the mentioned

Last Answer : operating

Description : ________ is an online storage drive that can be browsed and from which items can be shared. A. Find My iPhone B. iWeb Publish C. MobileMe Gallery D. iDisk

Last Answer : iDisk

Description : A ________ is a logical unit that serves as the target for storage operations, such as the SCSI protocol READs and WRITEs. a) GETs b) PUN c) LUN d) All of the mentioned

Last Answer : LUN

Description : The service FreeDrive is storage that allows ________ users to view the content of others. a) Facebook b) Twitter c) Whatsapp d) None of the mentioned

Last Answer : Facebook

Description : The addition of a software package on top of a cloud storage volume makes most cloud storage offerings conform to a ________ as a Service model. a) Software b) Platform c) Analytics d) None of the mentioned

Last Answer : Software