In my earlier post on Scala traits, I mentioned that it is possible to limit which classes can extend a trait. Yes, Scala lets you restrict a trait so that it can be mixed in only by specific subclasses.
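One way to express this restriction is a self-type annotation. A minimal sketch (the class and trait names here are illustrative, not from the original post):

```scala
// A self-type annotation restricts which classes may mix in a trait.
class Account

trait Auditing {
  this: Account =>                     // only subclasses of Account may mix this in
  def audit(msg: String): String = s"[AUDIT] $msg"
}

class SavingsAccount extends Account with Auditing

val acct = new SavingsAccount
println(acct.audit("opened"))          // [AUDIT] opened
```

A class that does not extend `Account` and tries `with Auditing` simply fails to compile, so the restriction is enforced at compile time.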
Most of us know Java interfaces. Traits in Scala are very similar to Java interfaces, with one key difference (comparing against Java 7 and earlier): traits can also contain methods with implementations. Looked at from another angle, a trait in Scala can be treated like a class that has no constructor parameters.
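A quick sketch of that difference (names are illustrative): a trait can mix abstract members, as an interface would, with fully implemented methods.

```scala
// A trait with both an abstract member and a concrete method.
trait Greeter {
  def name: String                     // abstract, like a Java interface method
  def greet: String = s"Hello, $name"  // implemented directly in the trait
}

class EnglishGreeter extends Greeter {
  val name = "Scala"
}

println(new EnglishGreeter().greet)    // Hello, Scala
```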
Being a purely object-oriented language, every value in the Scala world is an object; even the primitive data types are objects. And as I mentioned in my earlier post on Scala methods, there are no native operators in Scala. Instead, Scala uses corresponding methods that look like operators.
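For example, `1 + 2` is just sugar for calling the `+` method on the `Int` object `1`, and you can define such operator-like methods on your own types (the `Meters` class below is an illustrative example):

```scala
// Operators are ordinary methods: 1 + 2 is a call to Int's + method.
val sum1 = 1 + 2
val sum2 = (1).+(2)                    // same call, written explicitly
println(sum1 == sum2)                  // true

// User-defined types can declare operator-like methods too.
case class Meters(n: Int) {
  def +(other: Meters): Meters = Meters(n + other.n)
}
println(Meters(3) + Meters(4))         // Meters(7)
```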
Previously we learned about the Scala programming language, variable declaration, and method declaration. Now we'll take a closer look at Scala classes and objects. We'll also learn how to create singleton objects in Scala.
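As a preview (with illustrative names), the `object` keyword creates a singleton: a class with exactly one instance, created for you, with no `new` required.

```scala
// A regular class: each `new` creates a fresh instance.
class Counter(val start: Int)

// A singleton object: exactly one instance exists.
object Registry {
  private var count = 0
  def register(): Int = { count += 1; count }
}

println(Registry.register())           // 1
println(Registry.register())           // 2
```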
Now that we know how to declare variables in Scala, some of you probably want to try writing methods. Note that all parameters passed to a method are immutable in Scala (they are treated as vals). Here are some rules for writing methods.
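A minimal method sketch illustrating that point (the method name is just an example):

```scala
// Parameters behave like vals: they cannot be reassigned in the body.
def add(a: Int, b: Int): Int = {
  // a = 10   // would not compile: reassignment to val
  a + b
}

println(add(3, 4))                     // 7
```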
Scala variables are similar to Java variables; only the syntax differs slightly.
Rules for Scala variable declaration
- Type declaration and semicolon are optional.
- Keyword var is used to declare variables; their values can be reassigned.
- Keyword val is used to define values; they cannot be reassigned (similar to Java's final).
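The rules above look like this in practice:

```scala
// var: reassignable; the type annotation and semicolons are optional.
var count = 1
count = 2                              // OK: var can be reassigned

// val: cannot be reassigned (similar to Java's final).
val name: String = "Scala"             // explicit type, also optional
// name = "Java"                       // would not compile: reassignment to val

println(count)                         // 2
println(name)                          // Scala
```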
The word “Scala” is short for “Scalable Language”. Technically, Scala combines two styles of programming: object-oriented programming and functional programming. The object-oriented approach makes it easy to build large applications with substantial structure and architecture, while the functional approach keeps designs modular and pluggable. This fusion of the two approaches is what gives the Scala language its power.
Before we move ahead, let's learn a bit about Apache Spark and how to set it up.
So, what is Apache Spark?
Apache Spark is a fast, extremely expressive computing system that executes jobs in a distributed (clustered) environment and supports near-real-time processing.
It is compatible with Apache Hadoop, and is roughly 10x faster than Hadoop MapReduce for on-disk computation and up to 100x faster for in-memory computation. It provides rich APIs in Java, Scala, and Python, along with functional programming capabilities.
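Spark's RDD API closely mirrors Scala's own collection operations. As a rough illustration of that functional style, here is a word count written on plain Scala collections (not Spark itself; Spark offers the same map/reduce-style operations, just executed across a cluster):

```scala
// Functional-style word count on plain Scala collections.
val lines = List("spark is fast", "scala is fun", "spark is expressive")

val counts = lines
  .flatMap(_.split("\\s+"))            // split each line into words
  .groupBy(identity)                   // group identical words together
  .map { case (word, ws) => (word, ws.size) }

println(counts("spark"))               // 2
println(counts("is"))                  // 3
```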