Package Details: apache-spark 3.5.1-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: A unified analytics engine for large-scale data processing
Upstream URL: http://spark.apache.org
Keywords: spark
Licenses: Apache
Submitter: huitseeker
Maintainer: aakashhemadri
Last Packager: aakashhemadri
Votes: 57
Popularity: 0.000050
First Submitted: 2015-10-04 09:31 (UTC)
Last Updated: 2024-05-07 17:40 (UTC)

Dependencies (2)

Required by (2)

Sources (4)

Latest Comments


discord commented on 2016-04-16 04:13 (UTC)

Could my local .m2 settings be getting in the way?

makepkg -s
==> Making package: apache-spark 1.6.1-1 (Sat Apr 16 04:04:59 UTC 2016)
==> Checking runtime dependencies...
==> Checking buildtime dependencies...
==> Retrieving sources...
  -> Downloading spark-1.6.1.tgz...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 11.6M  100 11.6M    0     0  5375k      0  0:00:02  0:00:02 --:--:-- 5376k
  -> Found apache-spark-standalone.service
  -> Found spark-env.sh
==> Validating source files with md5sums...
    spark-1.6.1.tgz ... Passed
    apache-spark-standalone.service ... Passed
    spark-env.sh ... Passed
==> Extracting sources...
  -> Extracting spark-1.6.1.tgz with bsdtar
==> Starting prepare()...
==> Starting build()...
dev/../assembly/pom.xml
dev/../external/mqtt/pom.xml
dev/../external/zeromq/pom.xml
dev/../external/mqtt-assembly/pom.xml
dev/../external/twitter/pom.xml
dev/../external/flume-sink/pom.xml
dev/../external/kafka-assembly/pom.xml
dev/../external/flume-assembly/pom.xml
dev/../external/kafka/pom.xml
dev/../external/flume/pom.xml
dev/../bagel/pom.xml
dev/../mllib/pom.xml
dev/../launcher/pom.xml
dev/../sql/catalyst/pom.xml
dev/../sql/hive-thriftserver/pom.xml
dev/../sql/hive/pom.xml
dev/../sql/core/pom.xml
dev/../network/common/pom.xml
dev/../network/shuffle/pom.xml
dev/../network/yarn/pom.xml
dev/../graphx/pom.xml
dev/../tags/pom.xml
dev/../examples/pom.xml
dev/../streaming/pom.xml
dev/../tools/pom.xml
dev/../extras/kinesis-asl-assembly/pom.xml
dev/../extras/java8-tests/pom.xml
dev/../extras/kinesis-asl/pom.xml
dev/../extras/spark-ganglia-lgpl/pom.xml
dev/../unsafe/pom.xml
dev/../docker-integration-tests/pom.xml
dev/../dev/audit-release/blank_maven_build/pom.xml
dev/../dev/audit-release/maven_app_core/pom.xml
dev/../yarn/pom.xml
dev/../pom.xml
dev/../repl/pom.xml
dev/../core/pom.xml
dev/../docs/_plugins/copy_api_dirs.rb
+++ dirname ./make-distribution.sh
++ cd .
++ pwd
+ SPARK_HOME=/home/colin/build/AUR/apache-spark/src/spark-1.6.1
+ DISTDIR=/home/colin/build/AUR/apache-spark/src/spark-1.6.1/dist
+ SPARK_TACHYON=false
+ TACHYON_VERSION=0.8.2
+ TACHYON_TGZ=tachyon-0.8.2-bin.tar.gz
+ TACHYON_URL=http://tachyon-project.org/downloads/files/0.8.2/tachyon-0.8.2-bin.tar.gz
+ MAKE_TGZ=false
+ NAME=none
+ MVN=/home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn
+ (( 9 ))
+ case $1 in
+ break
+ '[' -z /usr/lib/jvm/default-runtime ']'
+ '[' -z /usr/lib/jvm/default-runtime ']'
++ command -v git
+ '[' /usr/bin/git ']'
++ git rev-parse --short HEAD
++ :
+ GITREV=
+ '[' '!' -z '' ']'
+ unset GITREV
++ command -v /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn
+ '[' '!' /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn ']'
++ /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn help:evaluate -Dexpression=project.version -Pscala-2.11 -DskipTests -Dmaven.repo.local=/tmp -DautoVersionSubmodules=true -U -Djline.version=2.13 -Djline.groupid=jline -Pyarn -Phadoop-2.6
++ grep -v INFO
++ tail -n 1
+ VERSION='[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/UnresolvableModelException'
==> ERROR: A failure occurred in build(). Aborting...
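One way to test discord's guess about local .m2 settings is to move the user-level settings file aside before rebuilding. The sketch below is illustrative only and uses a scratch directory as a stand-in for the real ~/.m2, so it is safe to run as-is:

```shell
# Stand-in for ~/.m2; replace M2_DIR with "$HOME/.m2" to do it for real.
M2_DIR=$(mktemp -d)
touch "$M2_DIR/settings.xml"   # stand-in for a user settings.xml

# Move the settings file aside, then rerun makepkg -s to rebuild.
mv "$M2_DIR/settings.xml" "$M2_DIR/settings.xml.bak"
ls "$M2_DIR"                   # -> settings.xml.bak
```

If the build then succeeds, something in the user settings (mirrors, proxies, a pinned repository) was interfering.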

Knight commented on 2016-03-12 10:25 (UTC)

@huitseeker, sorry for the delay. I don't know why, but strangely the installation succeeded this time...

huitseeker commented on 2016-03-09 08:11 (UTC)

Hi @Knight, the exact nature of the error appears above the Maven output you pasted here; without it, this message is difficult to interpret.

Knight commented on 2016-03-09 07:12 (UTC)

[INFO] Spark Project Parent POM ........................... SUCCESS [04:12 min]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 58.838 s]
[INFO] Spark Project Launcher ............................. SUCCESS [01:34 min]
[INFO] Spark Project Networking ........................... SUCCESS [ 31.249 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 36.530 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [01:00 min]
[INFO] Spark Project Core ................................. FAILURE [04:08 min]
SKIP...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:05 min
[INFO] Finished at: 2016-03-09T15:04:30+08:00
[INFO] Final Memory: 59M/735M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-core_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]

What should I do?

Aetf commented on 2016-02-23 04:43 (UTC)

The package() function should be:

@@ -55,7 +55,7 @@ package() {
   mkdir -p $pkgdir/etc/profile.d
   echo '#!/bin/sh' > $pkgdir/etc/profile.d/apache-spark.sh
-  echo 'SPARK_HOME=$pkgdir/usr/share/apache-spark' >> $pkgdir/etc/profile.d/apache-spark.sh
+  echo 'SPARK_HOME=/usr/share/apache-spark' >> $pkgdir/etc/profile.d/apache-spark.sh
   echo 'export SPARK_HOME' >> $pkgdir/etc/profile.d/apache-spark.sh
   chmod 755 $pkgdir/etc/profile.d/apache-spark.sh
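The reason the original line is wrong: inside single quotes $pkgdir is not expanded, so the literal string '$pkgdir/...' is baked into the generated /etc/profile.d/apache-spark.sh instead of a concrete path. A small sketch of the difference (the pkgdir value here is just a stand-in for makepkg's packaging directory):

```shell
pkgdir=/tmp/staging   # stand-in for makepkg's $pkgdir

# Broken variant: single quotes keep $pkgdir literal in the written line.
echo 'SPARK_HOME=$pkgdir/usr/share/apache-spark'   # -> SPARK_HOME=$pkgdir/usr/share/apache-spark

# Fixed variant: hard-codes the real install location.
echo 'SPARK_HOME=/usr/share/apache-spark'          # -> SPARK_HOME=/usr/share/apache-spark
```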

aftabnack commented on 2016-02-17 07:35 (UTC) (edited on 2016-02-17 07:38 (UTC) by aftabnack)

My build keeps failing at the point of compiling the MQTT module; every time it runs out of memory. I'm running 32-bit Arch in VirtualBox with 5 GB RAM. It fails with an OutOfMemory error saying "malloc failed to allocate 171128 bytes for chunk::new". I have run it about 8 times. I tried modifying the PKGBUILD, editing MAVEN_OPTS with different values for -Xmx and MaxPermSize. I can share the hs_err/replay file.
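For reference, raising Maven's JVM memory limits is normally done by exporting MAVEN_OPTS before the build step. The values below are illustrative guesses, not settings verified against this package; on a 32-bit VM the usable -Xmx ceiling is low regardless:

```shell
# Hypothetical values; tune -Xmx to what a single 32-bit process can get.
# MaxPermSize only applies to JDK 7 and earlier (removed in JDK 8).
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m"
echo "$MAVEN_OPTS"
```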

oneeyed commented on 2016-01-07 18:38 (UTC)

Adding two files in /etc/profile.d (sh and csh) to set SPARK_HOME would definitely help users of this package.
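A minimal sketch of the pair of files oneeyed suggests, assuming the package installs Spark under /usr/share/apache-spark:

```shell
# Contents for /etc/profile.d/apache-spark.sh (POSIX-compatible shells):
SPARK_HOME=/usr/share/apache-spark
export SPARK_HOME
echo "$SPARK_HOME"

# The counterpart, /etc/profile.d/apache-spark.csh, would instead contain
# the csh/tcsh form:
#   setenv SPARK_HOME /usr/share/apache-spark
```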

StefanK2 commented on 2016-01-05 14:46 (UTC) (edited on 2016-01-05 15:09 (UTC) by StefanK2)

LOL, I found the problem: SPARK_HOME wasn't set ;-) Everything is working now.

---

I just updated the package, but now I get this error message when I call spark-submit or spark-shell:

ls: cannot access /usr/assembly/target/scala-2.10: No such file or directory
Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
You need to build Spark before running this program.

According to pacman, scala 2.11 is installed on my machine. Also, when I checked in apache-spark/src/spark-1.6.0/assembly/target there is only a folder scala-2.11. Did I miss some configuration step or something? I changed line 36 in the PKGBUILD from dev/change-version-to-2.11.sh to dev/change-scala-version.sh 2.11 to get rid of the deprecation warning (I assumed my problem was related to the warning, but it didn't change anything). Also, I just found this: https://issues.apache.org/jira/browse/SPARK-7074; it seems to be a Spark problem.
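The PKGBUILD edit StefanK2 describes can be applied with a one-line sed substitution. The demo file below is a stand-in so the sketch runs on its own; against a real checkout you would target the PKGBUILD itself:

```shell
demo=$(mktemp)
echo 'dev/change-version-to-2.11.sh' > "$demo"

# Swap the deprecated helper for its renamed replacement (Spark 1.6 naming).
sed -i 's|dev/change-version-to-2.11.sh|dev/change-scala-version.sh 2.11|' "$demo"
cat "$demo"   # -> dev/change-scala-version.sh 2.11
```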

huitseeker commented on 2015-12-21 14:44 (UTC)

@fosskers I really can't reproduce your issue. Do you have more info on your config? Have you thought of clearing your Maven cache before rebuilding?
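Clearing the local Maven cache, as huitseeker suggests, usually means removing the local repository (default location ~/.m2/repository). The sketch below uses a scratch path as a stand-in so it is safe to run as-is:

```shell
REPO=$(mktemp -d)/repository   # stand-in for ~/.m2/repository
mkdir -p "$REPO/org/apache"    # simulate some cached artifacts

# Remove the whole repository; Maven re-downloads dependencies on the
# next build, which rules out corrupted or stale cached artifacts.
rm -rf "$REPO"
[ -d "$REPO" ] || echo "maven cache cleared"
```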