Normally we create a Spark application JAR using Scala and sbt (Simple Build Tool). In my previous post on Creating a Multi-node Spark Cluster we executed a word count example using the Spark shell. As an extension to that, we'll learn how to create a Spark application JAR file with Scala and sbt, and how to execute it as a Spark job on a Spark cluster.
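As a rough sketch of what such a project setup might look like, here is a minimal `build.sbt` (the project name, versions, and dependency coordinates below are illustrative assumptions, not taken from the post itself):

```scala
// build.sbt — minimal sketch; name and version numbers are illustrative
name := "spark-wordcount"
version := "0.1"
scalaVersion := "2.12.18"

// "provided" scope, since the Spark cluster supplies these classes at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.0" % "provided"
```

Running `sbt package` produces the application JAR under `target/`, which can then be submitted to the cluster with `spark-submit --class <YourMainClass> <path-to-jar>`.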
You may be surprised to know that you can also create a Scala script and execute it on Windows (as a batch script) as well as on Linux/Unix (as a shell script) without any explicit compilation step.
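For a flavour of what this looks like, here is a minimal sketch of a Scala script (the file name `greet.scala` is a hypothetical example). Saved as a plain text file, it can be run directly with the `scala` runner, with no separate compile step:

```scala
// greet.scala — hypothetical script file; run with: scala greet.scala
// Top-level statements are allowed in script mode, unlike a compiled Scala source file.
val greeting = "Hello from a Scala script!"
println(greeting)
```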
Scala exception handling is mostly similar to Java exception handling, except for the catch block syntax. Here I will explain the Scala exception handling concept only briefly, as it is a vast topic in itself.
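The key syntactic difference is that Scala's catch block uses pattern matching rather than Java's one-handler-per-exception syntax. A minimal sketch (the `safeDivide` function is a made-up example for illustration):

```scala
// In Scala, the catch block is a sequence of `case` patterns,
// not separate catch clauses as in Java.
def safeDivide(a: Int, b: Int): Int =
  try a / b
  catch {
    case _: ArithmeticException => 0 // e.g. division by zero
  }
```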
If you have worked with the Java programming language, you have probably used switch-case statements. Scala's pattern matching expression is quite similar to Java's switch-case statement, but provides far more features.
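To illustrate a few of those extra features, here is a small sketch (the `describe` function is an invented example): unlike a Java switch, a Scala `match` can test types, apply guards, and destructure values.

```scala
def describe(x: Any): String = x match {
  case 0                => "zero"
  case i: Int if i > 0  => "a positive int"          // type test plus a guard
  case s: String        => s"a string of length ${s.length}"
  case (a, b)           => s"a pair of $a and $b"    // destructuring a tuple
  case _                => "something else"          // default case, like Java's `default`
}
```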
While programming in Java (in versions earlier than Java 8), most of us have come across a situation where we get a NullPointerException. It happens because we often write code without checking all possibilities, and a method returns null under certain conditions.
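Scala's standard answer to this problem is the `Option` type: a method that may have no result returns `Some(value)` or `None` instead of null, so the caller is forced to handle the missing case. A minimal sketch (the `findUser` lookup is a hypothetical example):

```scala
// Returning Option instead of null makes the "no result" case explicit.
def findUser(id: Int): Option[String] =
  if (id == 1) Some("alice") else None

// The caller must deal with the absent case, e.g. via getOrElse —
// no NullPointerException is possible here.
val known   = findUser(1).getOrElse("unknown")
val missing = findUser(2).getOrElse("unknown")
```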