Scala Data Types by Code Example

Scala is a purely object-oriented language: every value in Scala is an object, including all of the primitive data types. And as I mentioned in my earlier post on Scala Methods, there are no native operators in Scala. Instead, Scala uses corresponding methods that look like operators. Continue reading “Scala Data Types by Code Example”
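A minimal sketch of both ideas (the object and value names are hypothetical): an “operator” like `+` is just a method defined on `Int`, so infix syntax and explicit method-call syntax are interchangeable.

```scala
// Demo: in Scala, operators are ordinary methods on objects.
object DataTypesDemo extends App {
  val sum1 = 1 + 2        // familiar infix operator syntax
  val sum2 = 1.+(2)       // the same call written as an explicit method call
  println(sum1 == sum2)   // true — both invoke Int's `+` method

  // Even primitive-looking values are objects with methods:
  println(42.toDouble)    // 42.0
  println(3.max(7))       // 7
}
```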

Scala Classes, Objects & Singleton Objects

Previously, we learned about the Scala programming language, variable declaration, and method declaration. Now we’ll take a closer look at Scala classes and objects. We’ll also learn how to create singleton objects in Scala.

Similar to Java, classes in Scala describe objects. A class may contain fields, methods, and constructors. Continue reading “Scala Classes, Objects & Singleton Objects”
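A short sketch of these pieces, with hypothetical names (`Person`, `PersonRegistry`): a class with a primary constructor, a field, and a method, plus a singleton `object`, which is Scala’s replacement for Java’s static members.

```scala
// A class: the primary constructor parameters double as fields via `val`.
class Person(val name: String, val age: Int) {
  def greet(): String = s"Hello, my name is $name"
}

// A singleton object: exactly one instance exists, created on first use.
object PersonRegistry {
  private var count = 0
  def register(p: Person): Int = { count += 1; count }
}

object ClassesDemo extends App {
  val p = new Person("Asha", 30)
  println(p.greet())                  // Hello, my name is Asha
  println(PersonRegistry.register(p)) // 1
}
```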

Scala Programming Language : Introduction

The word “Scala” is a short form of the term “Scalable Language”. Technically, Scala combines two styles of programming: object-oriented programming and functional programming. The object-oriented approach makes it easy to build large applications with complex structure and architecture, while the functional approach keeps the design modular and pluggable. Scala draws its power from the fusion of these two approaches. Continue reading “Scala Programming Language : Introduction”
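To make the fusion concrete, here is a small hedged sketch (the `Order` type and the values are invented for illustration): a case class is object-oriented data modeling, while the `filter`/`map`/`sum` pipeline is functional-style processing with no mutable loops.

```scala
// Object-oriented side: a class models the data.
case class Order(item: String, amount: Double)

object PricingDemo extends App {
  val orders = List(Order("book", 12.0), Order("pen", 3.5), Order("lamp", 20.0))

  // Functional side: transform the collection with pure, composable functions.
  val total = orders.filter(_.amount > 5.0).map(_.amount).sum
  println(total)  // 32.0
}
```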

Simple explanation of Hadoop Core Components : HDFS and MapReduce

In an earlier post we discussed what Hadoop is and what kinds of problems it solves. Now let’s dive into the core components of Hadoop. The Hadoop distribution provides two core components: HDFS (the Hadoop Distributed File System) and MapReduce (a distributed batch-processing framework). A set of machines running HDFS and MapReduce is known as a Hadoop cluster.
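The MapReduce processing model itself can be illustrated without a cluster. Below is a hedged sketch using plain Scala collections rather than the actual Hadoop API: the canonical word-count example, where the map phase emits `(word, 1)` pairs and the reduce phase sums the counts per key.

```scala
// Illustration of the MapReduce model with in-memory Scala collections.
// (Real Hadoop jobs run Mapper/Reducer classes over HDFS blocks instead.)
object WordCountSketch extends App {
  val lines = List("hadoop stores data", "hadoop processes data")

  // Map phase: split each line and emit (word, 1) pairs.
  val mapped = lines.flatMap(_.split(" ")).map(word => (word, 1))

  // Shuffle + reduce phase: group pairs by key, then sum each group's counts.
  val reduced = mapped.groupBy(_._1).map { case (w, ps) => (w, ps.map(_._2).sum) }

  println(reduced("hadoop"))  // 2
  println(reduced("data"))    // 2
}
```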

As you add more nodes to a Hadoop cluster, its performance increases, which means that Hadoop is horizontally scalable.  Continue reading “Simple explanation of Hadoop Core Components : HDFS and MapReduce”

Hadoop Core Concepts : The Bazics

In my earlier post, a Brief Introduction of Hadoop, we learned what Hadoop is and what kinds of problems it solves. The next step is to understand Hadoop’s core concepts, which cover:

  • Distributed system design
  • How data is distributed across multiple systems
  • The different components involved and how they communicate with each other

Continue reading “Hadoop Core Concepts : The Bazics”