Package Details: apache-spark 3.5.0-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: A unified analytics engine for large-scale data processing
Upstream URL: http://spark.apache.org
Keywords: spark
Licenses: Apache
Submitter: huitseeker
Maintainer: ttc0419
Last Packager: ttc0419
Votes: 57
Popularity: 0.020663
First Submitted: 2015-10-04 09:31 (UTC)
Last Updated: 2023-09-29 13:49 (UTC)

Dependencies (2)

Required by (2)

Sources (4)

Latest Comments


apetresc commented on 2019-01-30 02:44 (UTC)

The package is so out of date that the mirror no longer carries the source package corresponding to 2.3.1, so it fails to build now.

If 2.4.0 still requires additional testing that you don't have time to do yet, then please at least bump to version 2.3.2, which is a drop-in replacement. The patch is trivial.
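
The bump described here really is small; a minimal sketch of the usual manual AUR workflow (assuming no patches need rebasing and that the checksums are the only other change to the PKGBUILD):

$ git clone https://aur.archlinux.org/apache-spark.git
$ cd apache-spark
$ sed -i 's/^pkgver=2\.3\.1$/pkgver=2.3.2/' PKGBUILD   # point at the new release
$ updpkgsums    # regenerate the source checksums for the 2.3.2 tarball
$ makepkg -si   # build and install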

MangoMan commented on 2019-01-14 09:31 (UTC)

I concur with @wenbushi that hadoop should be a dependency, since installing hadoop resolved the previous JNI errors.

wenbushi commented on 2018-12-02 01:55 (UTC) (edited on 2018-12-02 02:35 (UTC) by wenbushi)

I also got the missing org/slf4j/Logger error, even after manually installing the slf4j AUR package. Maybe hadoop should be specified as a required dependency.

S3ppuku commented on 2018-10-28 13:23 (UTC)

I faced the same issue (the NoClassDefFoundError for org/slf4j/Logger), and adding the missing jars does not make Spark usable anyway (the run-master or start-master scripts still fail to launch Spark correctly). After digging a bit more, it seems that Spark still relies on Hadoop even when packaged from the spark-2.3.1-bin-without-hadoop.tgz archive. After switching to the Spark archive built for Hadoop 2.7, all the needed jars are indeed installed along with Spark.
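
For what it's worth, upstream documents this for the "Hadoop free" builds: Spark has to be pointed at an existing Hadoop installation via SPARK_DIST_CLASSPATH, otherwise Hadoop's jars (which is where slf4j comes from) never reach the classpath. A sketch, assuming the AUR hadoop package provides the hadoop command and that Spark's config lives under /opt/apache-spark/conf:

$ echo 'export SPARK_DIST_CLASSPATH=$(hadoop classpath)' \
    >> /opt/apache-spark/conf/spark-env.sh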

jaudet commented on 2018-10-23 02:13 (UTC) (edited on 2018-10-23 02:15 (UTC) by jaudet)

I don't have hadoop installed, and I also get an error about org.slf4j.Logger. Here's a snippet of the error:

$ spark-shell
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    [snip]
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    [snip]

Interestingly, installing the slf4j AUR package doesn't solve the issue.

arch_nomad commented on 2018-06-17 10:00 (UTC) (edited on 2018-06-17 10:01 (UTC) by arch_nomad)

I'm new to the whole Arch AUR scene, so sorry for the request; I would do it myself if I knew how.

Please update the spark version to 2.3.1.

Thanks in advance

wasperen commented on 2018-03-31 15:59 (UTC)

One needs to install hadoop from the AUR to run this, I think. Without it, it complains about missing org.slf4j.Logger.
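
For anyone who wants to try that, a sketch of the standard manual AUR build (no AUR helper assumed):

$ git clone https://aur.archlinux.org/hadoop.git
$ cd hadoop
$ makepkg -si   # resolve dependencies, build, and install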

lukeyeager commented on 2018-03-06 23:36 (UTC) (edited on 2018-03-07 00:02 (UTC) by lukeyeager)

$ pyspark

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0

EDIT: Works with jre8-openjdk-headless.
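
Class file version 52.0 is the Java 8 format, so the error above means the Spark jars target Java 8 while an older JRE is the system default. On Arch the default JVM can be switched with archlinux-java; a sketch, assuming jre8-openjdk (or the headless variant) is installed:

$ archlinux-java status                    # list installed Java environments
$ sudo archlinux-java set java-8-openjdk   # make Java 8 the system default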

huitseeker commented on 2017-10-10 20:00 (UTC)

@pdxleif Thanks a lot, fixed!

pdxleif commented on 2017-10-04 00:56 (UTC)

Also, spark-shell doesn't work with Java 9.