Package Details: apache-spark 3.5.1-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: A unified analytics engine for large-scale data processing
Upstream URL: http://spark.apache.org
Keywords: spark
Licenses: Apache
Submitter: huitseeker
Maintainer: aakashhemadri
Last Packager: aakashhemadri
Votes: 57
Popularity: 0.000050
First Submitted: 2015-10-04 09:31 (UTC)
Last Updated: 2024-05-07 17:40 (UTC)

Dependencies (2)

Required by (2)

Sources (4)

Latest Comments


sabitmaulanaa commented on 2019-05-27 14:23 (UTC)

Will this package be updated?

lukaszimmermann commented on 2019-04-26 09:29 (UTC)

Works, thanks!

blurbdust commented on 2019-04-26 00:31 (UTC)

Here is my current PKGBUILD, which works for the latest Spark as of 2019-04-23:

https://gist.github.com/blurbdust/3e03531d890c8bcf1da3c3d1192ce4d5

lukaszimmermann commented on 2019-04-25 09:14 (UTC)

I would be happy to become the maintainer of this package; then I could take care of some of the open issues here.

apetresc commented on 2019-01-30 02:44 (UTC)

The package is so out of date that the mirror no longer carries the source package corresponding to 2.3.1, so it fails to build now.

If 2.4.0 still requires additional testing that you don't have time to do yet, then please at least bump to version 2.3.2 which is a drop-in replacement. The patch is trivial.

MangoMan commented on 2019-01-14 09:31 (UTC)

I concur with @wenbushi that hadoop should be a dependency, since installing hadoop resolved the previous JNI errors.

wenbushi commented on 2018-12-02 01:55 (UTC) (edited on 2018-12-02 02:35 (UTC) by wenbushi)

I also got the missing org/slf4j/Logger error, even after installing the slf4j AUR package manually. Perhaps hadoop should be listed as a required dependency.

S3ppuku commented on 2018-10-28 13:23 (UTC)

I faced the same issue (the NoClassDefFoundError for org/slf4j/Logger), and adding the missing jars does not make Spark usable anyway (the run-master and start-master scripts still fail to launch Spark correctly). After digging a bit more, it seems that Spark still relies on Hadoop even when using the spark-2.3.1-bin-without-hadoop.tgz archive. After switching to the Spark archive built for Hadoop 2.7, all the jars Spark needs are indeed installed along with it.
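For anyone adapting the PKGBUILD locally along these lines, the change amounts to pointing the source array at the Hadoop-bundled tarball instead of the "without-hadoop" one. This is a sketch, not the package's actual PKGBUILD: the URL follows the usual Apache archive naming for Spark 2.3.1, and the checksum is a placeholder that would need regenerating.

```shell
# Hypothetical local PKGBUILD tweak (not the maintainer's version):
# swap the "without-hadoop" tarball for the Hadoop-2.7 bundled build,
# which ships the slf4j jars that Spark's scripts expect to find.
source=("https://archive.apache.org/dist/spark/spark-2.3.1/spark-2.3.1-bin-hadoop2.7.tgz")
sha256sums=('SKIP')  # placeholder; regenerate with `updpkgsums` after editing
```

After editing, rebuilding with `makepkg -si` would pick up the new archive.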

jaudet commented on 2018-10-23 02:13 (UTC) (edited on 2018-10-23 02:15 (UTC) by jaudet)

I don't have hadoop installed, and I also get an error about org.slf4j.Logger. Here's a snippet of the error:

$ spark-shell
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    [snip]
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    [snip]

Of interest is that installing the slf4j AUR package doesn't solve the issue.
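A plausible explanation is that Spark's launch scripts build the classpath from the jars bundled under its own install directory rather than from system-wide jars, so a separately installed slf4j never gets picked up. The check below is a hedged diagnostic sketch; the /opt/apache-spark path is an assumption about where this package installs Spark.

```shell
# Hedged diagnostic: look for slf4j jars in Spark's own jar directory.
# The install path is an assumption about this package's layout; with
# the "without-hadoop" tarball the grep typically finds nothing, which
# matches the NoClassDefFoundError for org/slf4j/Logger seen above.
ls /opt/apache-spark/jars/ 2>/dev/null | grep -i slf4j \
  || echo "slf4j jars missing from Spark's jar directory"
```

If the grep finds nothing, the slf4j classes are simply not on the classpath Spark assembles, regardless of what is installed system-wide.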

arch_nomad commented on 2018-06-17 10:00 (UTC) (edited on 2018-06-17 10:01 (UTC) by arch_nomad)

I'm new to the whole Arch AUR scene, so sorry for the request; I would do it myself if I knew how.

Please update the spark version to 2.3.1.

Thanks in advance