Package Details: apache-spark 2.4.0-1

Git Clone URL: (read-only)
Package Base: apache-spark
Description: fast and general engine for large-scale data processing
Upstream URL:
Licenses: Apache
Submitter: huitseeker
Maintainer: lukaszimmermann
Last Packager: huitseeker
Votes: 40
Popularity: 0.055285
First Submitted: 2015-10-04 09:31
Last Updated: 2019-01-30 19:46

Dependencies (9)

Required by (0)

Sources (7)

Latest Comments


lukaszimmermann commented on 2019-04-26 09:29

Works, thanks!

blurbdust commented on 2019-04-26 00:31

Here is my current PKGBUILD that works for the latest Spark as of 2019-04-23.

lukaszimmermann commented on 2019-04-25 09:14

I would be happy to become maintainer of this package, then I can take care of some open issues here.

apetresc commented on 2019-01-30 02:44

The package is so out of date that the mirror no longer carries the source package corresponding to 2.3.1, so it fails to build now.

If 2.4.0 still requires additional testing that you don't have time to do yet, then please at least bump to version 2.3.2 which is a drop-in replacement. The patch is trivial.

MangoMan commented on 2019-01-14 09:31

I concur with @wenbushi that hadoop should be a dependency, since installing hadoop resolved the previous JNI errors.

wenbushi commented on 2018-12-02 01:55

I've also got the missing org/slf4j/Logger error even after installing the slf4j AUR package manually. Maybe it should be specified that hadoop is a required dependency.

S3ppuku commented on 2018-10-28 13:23

I faced the same issue (the NoClassDefFoundError: org.slf4j.Logger), and adding the missing jars does not make Spark usable anyway (the run-master and start-master scripts still fail to launch Spark correctly). After digging a bit more, it seems that Spark still relies on Hadoop even when using the spark-2.3.1-bin-without-hadoop.tgz archive. After switching to the Spark archive built for Hadoop 2.7, all the needed jars are indeed installed along with Spark.
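(For context: the "without-hadoop" archives are Spark's "Hadoop free" builds, which expect an existing Hadoop installation to supply the Hadoop and slf4j jars via SPARK_DIST_CLASSPATH. A minimal sketch of that wiring, assuming the AUR hadoop package puts a `hadoop` command on PATH — e.g. in spark-env.sh:)

```shell
# Sketch: point a "Hadoop free" Spark build at an installed Hadoop's jars.
# Assumes the `hadoop` launcher is on PATH (e.g. from the AUR hadoop package).
if command -v hadoop >/dev/null 2>&1; then
    # `hadoop classpath` prints the jar directories Hadoop itself uses;
    # Spark prepends this to its own classpath at startup.
    export SPARK_DIST_CLASSPATH="$(hadoop classpath)"
else
    echo "hadoop not found on PATH; spark-*-bin-without-hadoop cannot run without it" >&2
fi
```

The archives built "for Hadoop 2.7" avoid this step by bundling the Hadoop jars directly, which is why switching archives also works.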

Ichimonji10 commented on 2018-10-23 02:13

I don't have hadoop installed, and I also get an error about org.slf4j.Logger. Here's a snippet of the error:

$ spark-shell                                                                                                        
Error: A JNI error has occurred, please check your installation and try again                                        
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger                                          
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger                                                        

Of interest is that installing the slf4j AUR package doesn't solve the issue.

arch_nomad commented on 2018-06-17 10:00

I'm new to this whole Arch AUR scene, so sorry for the request; I would do it myself if I knew how.

Please update the spark version to 2.3.1.

Thanks in advance

wasperen commented on 2018-03-31 15:59

One needs to install hadoop from the AUR to run this, I think. Without it, it complains about missing org.slf4j.Logger.