Package Details: apache-spark 2.3.0-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: fast and general engine for large-scale data processing
Upstream URL: http://spark.apache.org
Licenses: Apache
Submitter: huitseeker
Maintainer: huitseeker
Last Packager: huitseeker
Votes: 35
Popularity: 2.616920
First Submitted: 2015-10-04 09:31
Last Updated: 2018-04-06 06:17

Latest Comments

arch_nomad commented on 2018-06-17 10:00

I'm new to this whole Arch AUR scene, so sorry for the request; I would do it myself if I knew how.

Please update the spark version to 2.3.1.

Thanks in advance
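
For anyone in the same boat, a minimal sketch of bumping and rebuilding an AUR package locally (the exact pkgver line and checksums depend on the PKGBUILD, so treat this as an outline rather than the maintainer's procedure):

$ git clone https://aur.archlinux.org/apache-spark.git
$ cd apache-spark
# edit PKGBUILD: set pkgver=2.3.1 and reset pkgrel=1
$ updpkgsums    # from pacman-contrib; regenerates the source checksums
$ makepkg -si   # build and install the updated package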

wasperen commented on 2018-03-31 15:59

One needs to install hadoop from the AUR to run this, I think. Without it, it complains about missing org.slf4j.Logger.
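
For reference, a minimal sketch of installing hadoop from the AUR by hand (an AUR helper works just as well; the assumption here is that hadoop's jars supply the missing org.slf4j.Logger class on Spark's classpath):

$ git clone https://aur.archlinux.org/hadoop.git
$ cd hadoop
$ makepkg -si   # build and install hadoop and its dependencies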

lukeyeager commented on 2018-03-06 23:36

$ pyspark

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0

EDIT: Works with jre8-openjdk-headless
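
A minimal sketch of switching the default JVM to Java 8 with archlinux-java (the java-8-openjdk/jre environment name is an assumption; check `archlinux-java status` for the names actually installed on your system):

$ sudo pacman -S jre8-openjdk-headless
$ archlinux-java status                        # list installed Java environments
$ sudo archlinux-java set java-8-openjdk/jre   # make Java 8 the default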

huitseeker commented on 2017-10-10 20:00

@pdxleif Thanks a lot, fixed!

pdxleif commented on 2017-10-04 00:56

Also, spark-shell doesn't work with Java 9.

pdxleif commented on 2017-10-04 00:33

Running `spark-shell` gives:
/usr/bin/spark-shell: line 32: /usr/bin/find-spark-home: No such file or directory
/usr/bin/spark-shell is just a symlink to /opt/apache-spark/bin/spark-shell, but the script looks for `find-spark-home` in its own directory.
Also, once you get by that error, it fails with:
/opt/apache-spark/conf/spark-env.sh: line 4: hadoop: command not found
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
Is the `hadoop` package required to use spark-shell?
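
As a stopgap, the find-spark-home error can apparently be avoided by setting SPARK_HOME explicitly, since the upstream launcher scripts only source `find-spark-home` when SPARK_HOME is unset (a sketch, assuming the paths shipped by this package):

$ export SPARK_HOME=/opt/apache-spark
$ "$SPARK_HOME"/bin/spark-shell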

huitseeker commented on 2017-08-08 18:04

@hiliev thanks, fixed

hiliev commented on 2017-08-08 10:33

You have forgotten to update the SHA1 checksum of spark-env.sh in PKGBUILD.
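
For reference, a quick way to refresh PKGBUILD checksums (updpkgsums ships with pacman-contrib; `makepkg -g` is the manual alternative):

$ updpkgsums   # rewrites the checksum arrays in PKGBUILD in place
$ makepkg -g   # or: print fresh checksums to paste in by hand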

cippaciong commented on 2017-07-13 21:39

@brk0_0: I had the same issue; installing hadoop solved it. I don't know if there is a better solution.

brk0_0 commented on 2017-07-04 02:20

After installing it successfully, I get the following error:

$ pyspark
/usr/bin/pyspark: line 21: /usr/bin/find-spark-home: No such file or directory
/bin/load-spark-env.sh: line 26: /usr/bin/find-spark-home: No such file or directory
/bin/spark-submit: line 21: /bin/find-spark-home: No such file or directory
/bin/spark-class: line 21: /bin/find-spark-home: No such file or directory
/bin/load-spark-env.sh: line 26: /bin/find-spark-home: No such file or directory
Failed to find Spark jars directory (/assembly/target/scala-2.10/jars).
You need to build Spark with the target "package" before running this program.

The same thing occurs when trying spark-submit and the other scripts.
