Big Data Processing: MapReduce & Hadoop


When: Mon, August 17, 9am – Tue, August 18, 5pm, 2015
Where: Makmal Lanjutan, Level 2, Block A, Faculty of Computer Science and Information Technology, University Of Malaya, Kuala Lumpur, Malaysia
Admission: MYR 350

The purpose of this workshop is to provide basic training in the core techniques and concepts of the Big Data and Hadoop ecosystem: setting up and configuring a single-node, pseudo-distributed Hadoop installation, working with the Hadoop Distributed File System (HDFS), and writing MapReduce programs.

Hadoop is a game changer for companies working with Big Data: it brings together large pools of data, then stores and analyses them at scale. Large enterprises such as Amazon and IBM have embraced the technology to produce more accurate analyses and make better decisions. Hadoop is an open-source Apache Software Foundation project, written in Java, that enables the distributed processing of large datasets across clusters of commodity hardware. It has two primary components: HDFS and the MapReduce programming framework.
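The MapReduce model described above can be sketched in plain Python. This is not the Hadoop API (a real Hadoop job is written in Java against `org.apache.hadoop.mapreduce` and runs across a cluster); it is only an illustration of the three phases, map, shuffle, and reduce, using the classic word-count example. The function and variable names here are illustrative, not from any library.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit an intermediate (word, 1) pair for every word in every record."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Shuffle: group the intermediate values by key (done by the framework in Hadoop)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the grouped counts to get a total per word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big ideas", "data data everywhere"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts)  # {'big': 2, 'data': 3, 'ideas': 1, 'everywhere': 1}
```

In Hadoop, the same map and reduce logic would be supplied as `Mapper` and `Reducer` classes, while the framework handles splitting the input across HDFS blocks, shuffling intermediate pairs between nodes, and re-running failed tasks.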

Related Events
Aug 03  Android Development for Beginners 5 Days @ KL (MYR 500)
Aug 08  Introduction to JavaScript 2 Days @ Cyberjaya (MYR 50)

[Technical courses similar to those listed above are normally offered at MYR 2000+.]