
Ok, you have decided to set up a Hadoop cluster for your business. The next step is planning the cluster. But Hadoop is a complex stack, and you might have many questions: HDFS deals with replication and MapReduce creates intermediate files… How can I plan my storage needs? How do I plan my CPU needs? How do I plan my memory needs? Should I consider different needs on some nodes of the cluster? I have heard that MapReduce moves its job code to where the data to process is located…
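To make the storage question concrete, here is a minimal back-of-the-envelope sketch. The replication factor of 3 is the HDFS default; the 25% temp-space headroom for MapReduce intermediate files and the 75% target disk utilization are illustrative assumptions to tune for your own workloads, not fixed rules.

```python
def raw_storage_needed(logical_data_tb: float,
                       replication_factor: int = 3,
                       temp_space_ratio: float = 0.25,
                       target_utilization: float = 0.75) -> float:
    """Rough estimate of raw cluster disk (TB) needed for `logical_data_tb` of data.

    - replication_factor: HDFS stores 3 copies of each block by default.
    - temp_space_ratio: assumed headroom for MapReduce intermediate/shuffle files.
    - target_utilization: keep disks below ~75% full to leave operating margin.
    """
    replicated = logical_data_tb * replication_factor
    with_temp = replicated * (1 + temp_space_ratio)
    return with_temp / target_utilization


if __name__ == "__main__":
    # Example: 100 TB of logical data -> raw disk to provision across the cluster.
    print(f"{raw_storage_needed(100):.0f} TB of raw storage")  # ~500 TB
```

Under these assumptions, each logical terabyte of data translates into roughly 4 to 5 terabytes of raw disk across the cluster, which is why storage planning deserves its own discussion.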