Package Details: apache-spark 2.3.1-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: fast and general engine for large-scale data processing
Upstream URL: http://spark.apache.org
Licenses: Apache
Submitter: huitseeker
Maintainer: huitseeker
Last Packager: huitseeker
Votes: 40
Popularity: 1.511244
First Submitted: 2015-10-04 09:31
Last Updated: 2018-07-23 18:58

Dependencies (9)

Required by (0)

Sources (7)

Latest Comments


wenbushi commented on 2018-12-02 01:55

I've also got the missing org/slf4j/Logger error, even after installing the slf4j AUR package manually. Maybe hadoop should be listed as a required dependency.
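For anyone who wants to try that before the PKGBUILD changes, a rough sketch, assuming an AUR helper such as yay (any AUR install method works):

$ yay -S hadoop    # pulls hadoop from the AUR
$ spark-shell      # should now pick up the Hadoop-provided jars via spark-env.sh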

S3ppuku commented on 2018-10-28 13:23

I faced the same issue (the NoClassDefFoundError: org.slf4j.Logger), and adding the missing jars does not make Spark usable anyway (the run-master or start-master scripts still fail to launch Spark correctly). After digging a bit more, it seems that Spark still relies on Hadoop even when using the spark-2.3.1-bin-without-hadoop.tgz archive. After switching to the Spark archive built for Hadoop 2.7, all the needed jars are indeed installed along with Spark.
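If the package keeps shipping the without-hadoop build, the upstream docs for "Hadoop free" builds suggest pointing Spark at an existing Hadoop install via spark-env.sh; a minimal sketch, assuming hadoop is installed and on the PATH:

# in /opt/apache-spark/conf/spark-env.sh
export SPARK_DIST_CLASSPATH=$(hadoop classpath)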

Ichimonji10 commented on 2018-10-23 02:13

I don't have hadoop installed, and I also get an error about org.slf4j.Logger. Here's a snippet of the error:

$ spark-shell
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    [snip]
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    [snip]

Of interest is that installing the slf4j AUR package doesn't solve the issue.
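That is presumably because this package installs Spark's bundled jars under /opt/apache-spark/jars and the launcher builds its classpath from there, so a system-wide slf4j package is invisible to it. A quick check, assuming this package's layout:

$ ls /opt/apache-spark/jars | grep -i slf4j
$ # no output would mean the without-hadoop tarball simply doesn't ship the slf4j jars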

arch_nomad commented on 2018-06-17 10:00

I'm new to this whole Arch AUR scene, so sorry for the request; I would do it myself if I knew how.

Please update the spark version to 2.3.1.

Thanks in advance

wasperen commented on 2018-03-31 15:59

One needs to install hadoop from the AUR to run this, I think. Without it, it complains about missing org.slf4j.Logger.

lukeyeager commented on 2018-03-06 23:36

$ pyspark

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0

EDIT: Works with jre8-openjdk-headless.
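For context: class file version 52.0 corresponds to Java 8, so this error means the active JRE is older than Java 8. A sketch of switching the default runtime on Arch, assuming jre8-openjdk-headless is installed:

$ archlinux-java status                      # list installed Java environments
$ sudo archlinux-java set java-8-openjdk/jre # make the Java 8 JRE the default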

huitseeker commented on 2017-10-10 20:00

@pdxleif Thanks a lot, fixed!

pdxleif commented on 2017-10-04 00:56

Also, spark-shell doesn't work with Java 9.

pdxleif commented on 2017-10-04 00:33

Running `spark-shell` gives:
/usr/bin/spark-shell: line 32: /usr/bin/find-spark-home: No such file or directory
/usr/bin/spark-shell is just a symlink to /opt/apache-spark/bin/spark-shell, but the script looks for `find-spark-home` next to the path it was invoked as, not next to the real script.
Also, once you get past that error, it fails with:
/opt/apache-spark/conf/spark-env.sh: line 4: hadoop: command not found
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
Is the `hadoop` package required to use spark-shell?
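A workaround sketch for the first error until the symlink handling is fixed, assuming the /opt/apache-spark layout: call the real script so find-spark-home sits alongside it, and export SPARK_HOME yourself:

$ export SPARK_HOME=/opt/apache-spark
$ "$SPARK_HOME"/bin/spark-shell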

huitseeker commented on 2017-08-08 18:04

@hiliev thanks, fixed