Wouldn't Python with NumPy be a better fit? Or Fortran with some hand-wavy interface code?
Back (early '80s) when I did map-reduce, we used PL1/G.
From a business standpoint, though, there are a few main reasons:
–Data pipelines are well modeled as functions: they take a few input datasets, return a few outputs at the end, and do a ton of processing in between
–FP idioms generally make parallelization easier, and this is very important for the datasets we're dealing with
–A strong type system like Scala's lets us prevent many runtime errors, which is quite important when your pipelines can take several hours
–It's fairly trivial to wrap a statistical/ML algorithm in a pure functional interface, even if the algorithm itself is imperative
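To make that last point concrete, here's a minimal sketch (hypothetical names, not from any particular library) of an imperative routine hidden behind a pure interface: the mutation is confined to a private copy, so callers see a referentially transparent function.

    object PureWrapper {
      // Imperative core: mutates its argument in place.
      private def centerInPlace(xs: Array[Double]): Unit = {
        var sum = 0.0
        var i = 0
        while (i < xs.length) { sum += xs(i); i += 1 }
        val mean = sum / xs.length
        i = 0
        while (i < xs.length) { xs(i) -= mean; i += 1 }
      }

      // Pure facade: never modifies its input; same input, same output.
      def centered(xs: Array[Double]): Array[Double] = {
        val copy = xs.clone()
        centerInPlace(copy)
        copy
      }
    }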
For example, I've found that as a pipeline gets optimized for production use, it needs to preallocate all of its output space and then mutate it in place at each step (like a one-hot encoder flipping a few bits in specific rows of a zeroed array instead of allocating new ones and copying them in).
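Something like this, say (a hypothetical sketch of the preallocate-and-mutate style, not code from the post): one zeroed matrix allocated up front, with individual cells flipped in place.

    object OneHot {
      def encode(labels: Array[Int], numClasses: Int): Array[Array[Double]] = {
        // Preallocate the entire output as zeros...
        val out = Array.ofDim[Double](labels.length, numClasses)
        var i = 0
        while (i < labels.length) {
          // ...then flip exactly one cell per row, in place.
          out(i)(labels(i)) = 1.0
          i += 1
        }
        out
      }
    }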
I find it difficult to reconcile this sort of code with a "pure functions without side effects" philosophy and still have it perform at an acceptable level.
In jobs that were heavy on ML, I would use high-performance tools for the models (imperative code, numeric computing packages, etc.) and functional code for the ETL, which worked pretty well; no need to be dogmatic about it, a 70% pure codebase is still generally easier to reason about than a 20% pure codebase.
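For what that division of labor might look like, here's a rough sketch (all names hypothetical): pure, composable ETL steps, with the imperative model handed the cleaned data behind a single boundary.

    object Pipeline {
      final case class Row(features: Array[Double], label: Int)

      // Pure ETL steps: deterministic, no side effects.
      def parse(lines: Seq[String]): Seq[Row] =
        lines.map { l =>
          val parts = l.split(',')
          Row(parts.init.map(_.toDouble), parts.last.toInt)
        }

      def valid(rows: Seq[Row]): Seq[Row] =
        rows.filter(_.features.forall(d => !d.isNaN))

      // Imperative boundary: hand the cleaned rows to a mutable trainer.
      def run(lines: Seq[String], train: Seq[Row] => Unit): Unit =
        train(valid(parse(lines)))
    }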
Since boxed values on the JVM live behind object references, an array of boxed types is an array of pointers to scattered heap objects rather than a contiguous block; it will not be memory-aligned, and vector instructions (which are very important to scientific computing) cannot be used.
Perhaps something has changed in Scala land since I last looked?
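For reference, a small sketch of the distinction being made (standard Scala on the JVM): Array[Double] compiles down to a primitive double[], laid out contiguously, while an array of boxed doubles is an array of pointers to heap objects, which is what defeats alignment and vectorization.

    // Contiguous primitive storage: compiles to a JVM double[].
    val primitive: Array[Double] = Array(1.0, 2.0, 3.0)

    // Boxed storage: an array of references to java.lang.Double objects
    // scattered on the heap; no contiguity, no SIMD-friendly layout.
    val boxed: Array[java.lang.Double] = primitive.map(Double.box)

    // Generic code falls back to the boxed representation unless the
    // type parameter is specialized for primitives:
    def sumGeneric[T](xs: Array[T])(implicit num: Numeric[T]): T =
      xs.foldLeft(num.zero)(num.plus) // every element boxed via Numeric

Scala's @specialized annotation can avoid the boxing for a fixed set of primitive element types, though at a code-size cost, so the picture may be less bleak than it once was.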