Package Details: apache-spark 3.5.1-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: A unified analytics engine for large-scale data processing
Upstream URL: http://spark.apache.org
Keywords: spark
Licenses: Apache
Submitter: huitseeker
Maintainer: aakashhemadri
Last Packager: aakashhemadri
Votes: 57
Popularity: 0.014919
First Submitted: 2015-10-04 09:31 (UTC)
Last Updated: 2024-05-07 17:40 (UTC)

Dependencies (2)

Required by (2)

Sources (4)

Latest Comments


pdxleif commented on 2017-10-04 00:33 (UTC)

Running `spark-shell` gives:

    /usr/bin/spark-shell: line 32: /usr/bin/find-spark-home: No such file or directory

/usr/bin/spark-shell is just a symlink to /opt/apache-spark/bin/spark-shell, but the script looks for `find-spark-home` in its own directory. Also, once you get past that error, it fails with:

    /opt/apache-spark/conf/spark-env.sh: line 4: hadoop: command not found
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream

Is the `hadoop` package required to use spark-shell?
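
The "hadoop: command not found" line suggests the packaged conf/spark-env.sh builds Spark's classpath from the Hadoop CLI. A sketch of what such a line typically looks like (illustrative; the file shipped by this package may differ):

    # /opt/apache-spark/conf/spark-env.sh (illustrative sketch, not the packaged file)
    export SPARK_HOME=/opt/apache-spark
    # Hadoop-free Spark builds pick up the Hadoop jars via SPARK_DIST_CLASSPATH,
    # which is why the `hadoop` binary (from the hadoop package) must be on PATH:
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)

With the hadoop package installed, `hadoop classpath` resolves and the NoClassDefFoundError for FSDataInputStream goes away.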

huitseeker commented on 2017-08-08 18:04 (UTC)

@hiliev thanks, fixed

hiliev commented on 2017-08-08 10:33 (UTC) (edited on 2017-08-08 10:33 (UTC) by hiliev)

You have forgotten to update the SHA1 checksum of spark-env.sh in PKGBUILD.
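
If you maintain a local copy of the PKGBUILD, the checksum array can be regenerated rather than edited by hand, for example with updpkgsums from pacman-contrib (assuming the sources listed in the PKGBUILD are reachable):

    # run in the directory containing the PKGBUILD
    updpkgsums     # downloads the sources and rewrites the sha1sums=()/sha256sums=() arrays
    makepkg -si    # rebuild and install with the refreshed checksums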

cippaciong commented on 2017-07-13 21:39 (UTC)

@brk0_0: I had the same issue; installing hadoop solved it. I don't know if there is a better solution.

brk0_0 commented on 2017-07-04 02:20 (UTC) (edited on 2017-07-04 02:51 (UTC) by brk0_0)

After installing it successfully, I get the following error:

    $ pyspark
    /usr/bin/pyspark: line 21: /usr/bin/find-spark-home: No such file or directory
    /bin/load-spark-env.sh: line 26: /usr/bin/find-spark-home: No such file or directory
    /bin/spark-submit: line 21: /bin/find-spark-home: No such file or directory
    /bin/spark-class: line 21: /bin/find-spark-home: No such file or directory
    /bin/load-spark-env.sh: line 26: /bin/find-spark-home: No such file or directory
    Failed to find Spark jars directory (/assembly/target/scala-2.10/jars).
    You need to build Spark with the target "package" before running this program.

The same thing occurs when trying spark-submit and the others.
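
The empty prefix in "/assembly/target/scala-2.10/jars" suggests SPARK_HOME is resolving to an empty string when the wrappers are called through the /usr/bin symlinks. As a stopgap (assuming the package installs under /opt/apache-spark), exporting it explicitly and calling the real script may work:

    export SPARK_HOME=/opt/apache-spark
    "$SPARK_HOME"/bin/pyspark    # bypass the /usr/bin symlink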

adouzzy commented on 2017-01-16 22:29 (UTC)

Please update to 2.1.0. Cheers

huitseeker commented on 2016-12-29 18:40 (UTC)

@steph.schie updated!

steph.schie commented on 2016-12-27 14:33 (UTC)

Why is hadoop a dependency? I don't need or want to use hadoop with Spark.

huitseeker commented on 2016-11-18 07:45 (UTC)

There should be a file /etc/profile.d/apache-spark.sh that sets the $SPARK_HOME env variable correctly for you. If this is set up right, no patching of binaries should be needed. To test that assumption, first check the value of $SPARK_HOME (it should be /opt/apache-spark) and then run sh /opt/apache-spark/bin/load-spark-env.sh. Report back (with as much information as possible) if you see an error.
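
A minimal sketch of what that profile script presumably contains (illustrative; the packaged file may set more than this):

    # /etc/profile.d/apache-spark.sh (illustrative sketch)
    export SPARK_HOME=/opt/apache-spark
    export PATH=$PATH:$SPARK_HOME/bin

Scripts in /etc/profile.d are sourced by login shells, so the variable only takes effect after logging in again (or after sourcing the file manually in the current shell).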

TaXules commented on 2016-11-08 16:30 (UTC)

To fix the error "ls: cannot access '/usr/assembly/target/scala-2.10': No such …", you must patch the Spark bins by running: "sed -i 's/`dirname "$0"`/`dirname "$(readlink -f $0)"`/g' /opt/apache-spark/bin/*" (readlink is in coreutils).
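
The sed call rewrites every `dirname "$0"` in the launcher scripts to resolve symlinks first, so a wrapper invoked as /usr/bin/spark-shell still finds its siblings under /opt/apache-spark/bin. A quick check after patching (paths assumed from the comments above):

    readlink -f /usr/bin/spark-shell                          # should print /opt/apache-spark/bin/spark-shell
    grep -n 'readlink -f' /opt/apache-spark/bin/spark-shell   # confirms the script was patched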