Package Details: apache-spark 3.2.0-0

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: Apache Spark is a unified analytics engine for large-scale data processing.
Upstream URL: http://spark.apache.org
Keywords: spark
Licenses: Apache
Submitter: huitseeker
Maintainer: lukaszimmermann (emanuelfontelles)
Last Packager: emanuelfontelles
Votes: 50
Popularity: 0.066688
First Submitted: 2015-10-04 09:31 (UTC)
Last Updated: 2021-12-08 02:04 (UTC)

Dependencies (9)

Required by (1)

Sources (7)

Latest Comments

dmfay commented on 2020-04-22 20:13 (UTC)

For 2.4.5 (also fixes the worker unit description):

diff --git a/PKGBUILD b/PKGBUILD
index 54ec365..14d1180 100644
--- a/PKGBUILD
+++ b/PKGBUILD
@@ -3,7 +3,7 @@
 # Contributor: Emanuel Fontelles ("emanuelfontelles") <emanuelfontelles@hotmail.com>

 pkgname=apache-spark
-pkgver=2.4.4
+pkgver=2.4.5
 pkgrel=1
 pkgdesc="fast and general engine for large-scale data processing"
 arch=('any')
@@ -26,7 +26,7 @@ source=("https://archive.apache.org/dist/spark/spark-${pkgver}/spark-${pkgver}-b
         'spark-daemon-run.sh'
         'run-master.sh'
         'run-slave.sh')
-sha1sums=('53f99ba8c5a68c941dd17d45393a6040dd0b46c8'
+sha1sums=('338756ea89c2d15985ee24b46cec21bf9c7f2622'
           'ac71d12070a9a10323e8ec5aed4346b1dd7f21c6'
           'a191e4f8f7f8bbc596f4fadfb3c592c3efbc4fc0'
           '3fa39d55075d4728bd447692d648053c9f6b07ec'
diff --git a/apache-spark-slave@.service b/apache-spark-slave@.service
index 453b346..a90e866 100644
--- a/apache-spark-slave@.service
+++ b/apache-spark-slave@.service
@@ -1,5 +1,5 @@
 [Unit]
-Description=Apache Spark Standalone Master
+Description=Apache Spark Worker
 After=network.target

 [Service]
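
In case it helps anyone, roughly how this could be applied against a checkout of the AUR repo (just a sketch; spark-2.4.5.patch is an example filename for the diff above):

$ git clone https://aur.archlinux.org/apache-spark.git
$ cd apache-spark
$ patch -p1 < ../spark-2.4.5.patch   # the diff above, saved locally
$ makepkg -si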

ryukinix commented on 2020-01-17 23:12 (UTC) (edited on 2020-01-17 23:13 (UTC) by ryukinix)

Updating to Spark 3.0.0-preview2 makes it work with Python 3.8. I'm using this modified version of the PKGBUILD: https://github.com/ryukinix/apache-spark-pkgbuild. So far it's working fine.

The v2.4.5 tag was created on GitHub 5 days ago, but the compiled version is not yet available in the Apache Spark archive, which is why I used the 3.0.0-preview2 version, the most recent one available there.

The v2.4.5 and v3.0.0 versions contain the commit that fixes the problems with Python 3.8: https://github.com/apache/spark/commit/811d563fbf60203377e8462e4fad271c1140b4fa
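
Building from that repo is roughly (a sketch, assuming the PKGBUILD there still builds as-is):

$ git clone https://github.com/ryukinix/apache-spark-pkgbuild.git
$ cd apache-spark-pkgbuild
$ makepkg -si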

YaLTeR commented on 2019-12-13 08:25 (UTC) (edited on 2019-12-13 08:45 (UTC) by YaLTeR)

Doesn't seem to work with Python 3 (the default)?

└─ pyspark
Picked up _JAVA_OPTIONS: -Dawt.useSystemAAFontSettings=on -Dswing.aatext=true
Python 3.8.0 (default, Oct 23 2019, 18:51:26)
[GCC 9.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/opt/apache-spark/python/pyspark/shell.py", line 31, in <module>
    from pyspark import SparkConf
  File "/opt/apache-spark/python/pyspark/__init__.py", line 51, in <module>
    from pyspark.context import SparkContext
  File "/opt/apache-spark/python/pyspark/context.py", line 31, in <module>
    from pyspark import accumulators
  File "/opt/apache-spark/python/pyspark/accumulators.py", line 97, in <module>
    from pyspark.serializers import read_int, PickleSerializer
  File "/opt/apache-spark/python/pyspark/serializers.py", line 71, in <module>
    from pyspark import cloudpickle
  File "/opt/apache-spark/python/pyspark/cloudpickle.py", line 145, in <module>
    _cell_set_template_code = _make_cell_set_template_code()
  File "/opt/apache-spark/python/pyspark/cloudpickle.py", line 126, in _make_cell_set_template_code
    return types.CodeType(
TypeError: an integer is required (got type bytes)

Update: looks like it's Python 3.8; installing python37 from the AUR and using that seems to work.
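
For reference, a rough way to point pyspark at the AUR python37 build (assuming that package installs a python3.7 binary on PATH):

$ PYSPARK_PYTHON=python3.7 PYSPARK_DRIVER_PYTHON=python3.7 pyspark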

tir commented on 2019-07-17 10:35 (UTC) (edited on 2019-07-17 10:39 (UTC) by tir)

I had success with the following PKGBUILD (patch: https://git.io/fj1CT).

# Maintainer: François Garillot ("huitseeker") <francois [at] garillot.net>
# Contributor: Christian Krause ("wookietreiber") <kizkizzbangbang@gmail.com>

pkgname=apache-spark
pkgver=2.4.3
pkgrel=1
pkgdesc="fast and general engine for large-scale data processing"
arch=('any')
url="http://spark.apache.org"
license=('APACHE')
depends=('java-environment>=6' 'java-environment<9')
optdepends=('python2: python2 support for pyspark'
            'ipython2: ipython2 support for pyspark'
            'python: python3 support for pyspark'
            'ipython: ipython3 support for pyspark'
            'r: support for sparkR'
            'rsync: support rsync hadoop binaries from master'
            'hadoop: support for running on YARN')
install=apache-spark.install
source=("https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=spark/spark-${pkgver}/spark-${pkgver}-bin-without-hadoop.tgz"
        'apache-spark-master.service'
        'apache-spark-slave@.service'
        'spark-env.sh'
        'spark-daemon-run.sh'
        'run-master.sh'
        'run-slave.sh')
sha1sums=('54bf6a19eb832dc0cf2d7a7465b785390d00122b'
          'ac71d12070a9a10323e8ec5aed4346b1dd7f21c6'
          'a191e4f8f7f8bbc596f4fadfb3c592c3efbc4fc0'
          '3fa39d55075d4728bd447692d648053c9f6b07ec'
          '08557d2d5328d5c99e533e16366fd893fffaad78'
          '323445b8d64aea0534a2213d2600d438f406855b'
          '65b1bc5fce63d1fa7a1b90f2d54a09acf62012a4')
backup=('etc/apache-spark/spark-env.sh')

PKGEXT=${PKGEXT:-'.pkg.tar.xz'}

prepare() {
  cd "$srcdir/spark-${pkgver}-bin-without-hadoop"
}

package() {
        cd "$srcdir/spark-${pkgver}-bin-without-hadoop"

        install -d "$pkgdir/usr/bin" "$pkgdir/opt" "$pkgdir/var/log/apache-spark" "$pkgdir/var/lib/apache-spark/work"
        chmod 2775 "$pkgdir/var/log/apache-spark" "$pkgdir/var/lib/apache-spark/work"

        cp -r "$srcdir/spark-${pkgver}-bin-without-hadoop" "$pkgdir/opt/apache-spark/"

        cd "$pkgdir/usr/bin"
        for binary in beeline pyspark sparkR spark-class spark-shell find-spark-home spark-sql spark-submit load-spark-env.sh; do
                binpath="/opt/apache-spark/bin/$binary"
                ln -s "$binpath" $binary
                sed -i 's|^export SPARK_HOME=.*$|export SPARK_HOME=/opt/apache-spark|' "$pkgdir/$binpath"
                sed -i -Ee 's/\$\(dirname "\$0"\)/$(dirname "$(readlink -f "$0")")/g' "$pkgdir/$binpath"
        done

        mkdir -p $pkgdir/etc/profile.d
        echo '#!/bin/sh' > $pkgdir/etc/profile.d/apache-spark.sh
        echo 'SPARK_HOME=/opt/apache-spark' >> $pkgdir/etc/profile.d/apache-spark.sh
        echo 'export SPARK_HOME' >> $pkgdir/etc/profile.d/apache-spark.sh
        chmod 755 $pkgdir/etc/profile.d/apache-spark.sh

        install -Dm644 "$srcdir/apache-spark-master.service" "$pkgdir/usr/lib/systemd/system/apache-spark-master.service"
        install -Dm644 "$srcdir/apache-spark-slave@.service" "$pkgdir/usr/lib/systemd/system/apache-spark-slave@.service"
        install -Dm644 "$srcdir/spark-env.sh" "$pkgdir/etc/apache-spark/spark-env.sh"
        for script in run-master.sh run-slave.sh spark-daemon-run.sh; do
            install -Dm755 "$srcdir/$script" "$pkgdir/opt/apache-spark/sbin/$script"
        done
        install -Dm644 "$srcdir/spark-${pkgver}-bin-without-hadoop/conf"/* "$pkgdir/etc/apache-spark"

        cd "$pkgdir/opt/apache-spark"
        mv conf conf-templates
        ln -sf "/etc/apache-spark" conf
        ln -sf "/var/lib/apache-spark/work" .
}

There is another, unrelated issue probably with JLine (as noted here: https://github.com/sanori/spark-sbt/issues/4#issuecomment-401621777), for which the workaround would be:

$ TERM=xterm-color spark-shell
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/

Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 10.0.2)
Type in expressions to have them evaluated.
Type :help for more information.

lukaszimmermann commented on 2019-07-15 09:05 (UTC)

Hi, sorry for being unresponsive the last couple of months. Emanuel, I added you as co-maintainer; could you perhaps update this package with the new PKGBUILD for Spark 2.4.3?

emanuelfontelles commented on 2019-07-13 17:56 (UTC) (edited on 2019-07-13 17:57 (UTC) by emanuelfontelles)

Hey guys, here is a full PKGBUILD for Spark 2.4.3 with the Hadoop binaries; it works fine:

# Maintainer: François Garillot ("huitseeker") <francois [at] garillot.net>
# Contributor: Christian Krause ("wookietreiber") <kizkizzbangbang@gmail.com>
# Contributor: Emanuel Fontelles ("emanuelfontelles") <emanuelfontelles@hotmail.com>

pkgname=apache-spark
pkgver=2.4.3
pkgrel=1
pkgdesc="fast and general engine for large-scale data processing"
arch=('any')
url="http://spark.apache.org"
license=('APACHE')
depends=('java-environment>=6' 'java-environment<9')
optdepends=('python2: python2 support for pyspark'
            'ipython2: ipython2 support for pyspark'
            'python: python3 support for pyspark'
            'ipython: ipython3 support for pyspark'
            'r: support for sparkR'
            'rsync: support rsync hadoop binaries from master'
            'hadoop: support for running on YARN')

install=apache-spark.install
source=("https://archive.apache.org/dist/spark/spark-${pkgver}/spark-${pkgver}-bin-hadoop2.7.tgz"
        'apache-spark-master.service'
        'apache-spark-slave@.service'
        'spark-env.sh'
        'spark-daemon-run.sh'
        'run-master.sh'
        'run-slave.sh')
sha1sums=('7b2f1be5c4ccec86c6d2b1e54c379b7af7a5752a'
          'ac71d12070a9a10323e8ec5aed4346b1dd7f21c6'
          'a191e4f8f7f8bbc596f4fadfb3c592c3efbc4fc0'
          '3fa39d55075d4728bd447692d648053c9f6b07ec'
          '08557d2d5328d5c99e533e16366fd893fffaad78'
          '323445b8d64aea0534a2213d2600d438f406855b'
          '65b1bc5fce63d1fa7a1b90f2d54a09acf62012a4')
backup=('etc/apache-spark/spark-env.sh')

PKGEXT=${PKGEXT:-'.pkg.tar.xz'}

prepare() {
  cd "$srcdir/spark-${pkgver}-bin-hadoop2.7"
}

package() {
        cd "$srcdir/spark-${pkgver}-bin-hadoop2.7"

        install -d "$pkgdir/usr/bin" "$pkgdir/opt" "$pkgdir/var/log/apache-spark" "$pkgdir/var/lib/apache-spark/work"
        chmod 2775 "$pkgdir/var/log/apache-spark" "$pkgdir/var/lib/apache-spark/work"

        cp -r "$srcdir/spark-${pkgver}-bin-hadoop2.7" "$pkgdir/opt/apache-spark/"

        cd "$pkgdir/usr/bin"
        for binary in beeline pyspark sparkR spark-class spark-shell find-spark-home spark-sql spark-submit load-spark-env.sh; do
                binpath="/opt/apache-spark/bin/$binary"
                ln -s "$binpath" $binary
                sed -i 's|^export SPARK_HOME=.*$|export SPARK_HOME=/opt/apache-spark|' "$pkgdir/$binpath"
                sed -i -Ee 's/\$\(dirname "\$0"\)/$(dirname "$(readlink -f "$0")")/g' "$pkgdir/$binpath"
        done

        mkdir -p $pkgdir/etc/profile.d
        echo '#!/bin/sh' > $pkgdir/etc/profile.d/apache-spark.sh
        echo 'SPARK_HOME=/opt/apache-spark' >> $pkgdir/etc/profile.d/apache-spark.sh
        echo 'export SPARK_HOME' >> $pkgdir/etc/profile.d/apache-spark.sh
        chmod 755 $pkgdir/etc/profile.d/apache-spark.sh

        install -Dm644 "$srcdir/apache-spark-master.service" "$pkgdir/usr/lib/systemd/system/apache-spark-master.service"
        install -Dm644 "$srcdir/apache-spark-slave@.service" "$pkgdir/usr/lib/systemd/system/apache-spark-slave@.service"
        install -Dm644 "$srcdir/spark-env.sh" "$pkgdir/etc/apache-spark/spark-env.sh"
        for script in run-master.sh run-slave.sh spark-daemon-run.sh; do
            install -Dm755 "$srcdir/$script" "$pkgdir/opt/apache-spark/sbin/$script"
        done
        install -Dm644 "$srcdir/spark-${pkgver}-bin-hadoop2.7/conf"/* "$pkgdir/etc/apache-spark"

        cd "$pkgdir/opt/apache-spark"
        mv conf conf-templates
        ln -sf "/etc/apache-spark" conf
        ln -sf "/var/lib/apache-spark/work" .
}

marcinn commented on 2019-06-14 20:46 (UTC) (edited on 2019-06-14 20:46 (UTC) by marcinn)

PKGBUILD patch for 2.4.3:

diff --git a/PKGBUILD b/PKGBUILD
index 3540d3e..e31fec9 100644
--- a/PKGBUILD
+++ b/PKGBUILD
@@ -2,7 +2,7 @@
 # Contributor: Christian Krause ("wookietreiber") <kizkizzbangbang@gmail.com>

 pkgname=apache-spark
-pkgver=2.4.0
+pkgver=2.4.3
 pkgrel=1
 pkgdesc="fast and general engine for large-scale data processing"
 arch=('any')
@@ -17,14 +17,14 @@ optdepends=('python2: python2 support for pyspark'
             'rsync: support rsync hadoop binaries from master'
             'hadoop: support for running on YARN')
 install=apache-spark.install
-source=("https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=spark/spark-${pkgver}/spark-${pkgver}-bin-without-hadoop.tgz"
+source=("https://archive.apache.org/dist/spark/spark-${pkgver}/spark-${pkgver}-bin-without-hadoop.tgz"
         'apache-spark-master.service'
         'apache-spark-slave@.service'
         'spark-env.sh'
         'spark-daemon-run.sh'
         'run-master.sh'
         'run-slave.sh')
-sha1sums=('ce6fe98272b78a5c487d16f0f5f828908b65d7fe'
+sha1sums=('54bf6a19eb832dc0cf2d7a7465b785390d00122b'
           'ac71d12070a9a10323e8ec5aed4346b1dd7f21c6'
           'a191e4f8f7f8bbc596f4fadfb3c592c3efbc4fc0'
           '3fa39d55075d4728bd447692d648053c9f6b07ec'

sabitmaulanaa commented on 2019-05-27 14:23 (UTC)

Will this package be updated?

lukaszimmermann commented on 2019-04-26 09:29 (UTC)

Works, thanks!

blurbdust commented on 2019-04-26 00:31 (UTC)

Here is my current PKGBUILD, which works for the latest Spark as of 4/23/19:

https://gist.github.com/blurbdust/3e03531d890c8bcf1da3c3d1192ce4d5

lukaszimmermann commented on 2019-04-25 09:14 (UTC)

I would be happy to become the maintainer of this package; then I could take care of some of the open issues here.

apetresc commented on 2019-01-30 02:44 (UTC)

The package is so out of date that the mirror no longer carries the source package corresponding to 2.3.1, so it fails to build now.

If 2.4.0 still requires additional testing that you don't have time for yet, then please at least bump to version 2.3.2, which is a drop-in replacement. The patch is trivial.
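
Something like this should be all that's needed (a sketch; assumes the current PKGBUILD has pkgver=2.3.1, and the checksum has to be regenerated, e.g. with updpkgsums from pacman-contrib):

$ sed -i 's/^pkgver=2.3.1$/pkgver=2.3.2/' PKGBUILD
$ updpkgsums   # refreshes sha1sums for the new tarball
$ makepkg -si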

MangoMan commented on 2019-01-14 09:31 (UTC)

I concur with @wenbushi that hadoop should be a dependency, since installing hadoop resolved the previous JNI errors.

wenbushi commented on 2018-12-02 01:55 (UTC) (edited on 2018-12-02 02:35 (UTC) by wenbushi)

I've also got the missing org/slf4j/Logger error even after installing the slf4j AUR package manually. Maybe it should be specified that hadoop is a required dependency.

S3ppuku commented on 2018-10-28 13:23 (UTC)

I faced the same issue (the NoClassDefFoundError: org/slf4j/Logger), and adding the missing jars does not make Spark usable anyway (the run-master or start-master scripts still fail to launch Spark correctly). After digging a bit more, it seems that Spark still relies on Hadoop even when using the spark-2.3.1-bin-without-hadoop.tgz archive. After switching to the Spark archive built for Hadoop 2.7, all the jars Spark needs are indeed installed along with it.

jaudet commented on 2018-10-23 02:13 (UTC) (edited on 2018-10-23 02:15 (UTC) by jaudet)

I don't have hadoop installed, and I also get an error about org.slf4j.Logger. Here's a snippet of the error:

$ spark-shell                                                                                                        
Error: A JNI error has occurred, please check your installation and try again                                        
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger                                          
    [snip]
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger                                                        
    [snip]

Of interest is that installing the slf4j AUR package doesn't solve the issue.

arch_nomad commented on 2018-06-17 10:00 (UTC) (edited on 2018-06-17 10:01 (UTC) by arch_nomad)

I'm new to the whole Arch/AUR scene, so sorry for the request; I would do it myself if I knew how.

Please update the spark version to 2.3.1.

Thanks in advance

wasperen commented on 2018-03-31 15:59 (UTC)

One needs to install hadoop from the AUR to run this, I think. Without it, it complains about missing org.slf4j.Logger.

lukeyeager commented on 2018-03-06 23:36 (UTC) (edited on 2018-03-07 00:02 (UTC) by lukeyeager)

$ pyspark

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0

EDIT: Works with jre8-openjdk-headless.
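
In case it's useful, switching the default JVM on Arch looks roughly like this (a sketch; the exact environment name depends on which Java 8 package you install, so check the status output first):

$ sudo pacman -S jre8-openjdk-headless
$ archlinux-java status                  # list installed Java environments
$ sudo archlinux-java set java-8-openjdk/jre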

huitseeker commented on 2017-10-10 20:00 (UTC)

@pdxleif Thanks a lot, fixed!

pdxleif commented on 2017-10-04 00:56 (UTC)

Also, spark-shell doesn't work with Java 9.

pdxleif commented on 2017-10-04 00:33 (UTC)

Running `spark-shell` gives:

/usr/bin/spark-shell: line 32: /usr/bin/find-spark-home: No such file or directory

/usr/bin/spark-shell is just a symlink to /opt/apache-spark/bin/spark-shell, but the script looks for `find-spark-home` in its current directory.

Also, once you get past that error, it fails with:

/opt/apache-spark/conf/spark-env.sh: line 4: hadoop: command not found
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream

Is the `hadoop` package required to use spark-shell?

huitseeker commented on 2017-08-08 18:04 (UTC)

@hiliev thanks, fixed

hiliev commented on 2017-08-08 10:33 (UTC) (edited on 2017-08-08 10:33 (UTC) by hiliev)

You have forgotten to update the SHA1 checksum of spark-env.sh in PKGBUILD.

cippaciong commented on 2017-07-13 21:39 (UTC)

@brk0_0: I had the same issue; installing hadoop solved it. I don't know if there are better solutions.

brk0_0 commented on 2017-07-04 02:20 (UTC) (edited on 2017-07-04 02:51 (UTC) by brk0_0)

After installing it successfully, I get the following error:

$ pyspark
/usr/bin/pyspark: line 21: /usr/bin/find-spark-home: No such file or directory
/bin/load-spark-env.sh: line 26: /usr/bin/find-spark-home: No such file or directory
/bin/spark-submit: line 21: /bin/find-spark-home: No such file or directory
/bin/spark-class: line 21: /bin/find-spark-home: No such file or directory
/bin/load-spark-env.sh: line 26: /bin/find-spark-home: No such file or directory
Failed to find Spark jars directory (/assembly/target/scala-2.10/jars).
You need to build Spark with the target "package" before running this program.

The same thing occurs when trying spark-submit and the others.

adouzzy commented on 2017-01-16 22:29 (UTC)

Please update to 2.1.0. Cheers

huitseeker commented on 2016-12-29 18:40 (UTC)

@steph.schie updated!

steph.schie commented on 2016-12-27 14:33 (UTC)

Why is hadoop a dependency? I don't need or want to use hadoop with spark.

huitseeker commented on 2016-11-18 07:45 (UTC)

There should be a file /etc/profile.d/apache-spark.sh that sets the $SPARK_HOME env variable for you correctly. If this is set up right, you should need no patching of binaries. To test that assumption, first check the value of $SPARK_HOME (it should be /opt/apache-spark) and then run sh /opt/apache-spark/bin/load-spark-env.sh. If you then see an error, report it with as much information as possible.
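
In shell terms, the check described above is roughly:

$ echo "$SPARK_HOME"                          # should print /opt/apache-spark
$ sh /opt/apache-spark/bin/load-spark-env.sh  # should produce no error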

TaXules commented on 2016-11-08 16:30 (UTC)

To fix the error "ls: cannot access '/usr/assembly/target/scala-2.10': No such …", you must patch the Spark bin scripts by running (readlink is in coreutils):

sed -i 's/`dirname "$0"`/`dirname "$(readlink -f $0)"`/g' /opt/apache-spark/bin/*

mtrokic commented on 2016-08-13 17:20 (UTC)

I think some dependencies are missing. After installing gcc-fortran and postgresql-libs I was able to compile successfully.

sidec commented on 2016-07-10 20:16 (UTC) (edited on 2016-07-10 20:19 (UTC) by sidec)

After a successful makepkg -sri I fail to run spark-shell; I get this message instead:

ls: cannot access '/usr/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
You need to build Spark before running this program.

axelmagn commented on 2016-06-29 19:24 (UTC)

I can confirm discord's issue. I am having the same problem.

huitseeker commented on 2016-05-03 15:40 (UTC)

I have trouble reproducing your issue, sorry.

discord commented on 2016-04-26 18:31 (UTC)

I'm considering removing hive from the pom build and rebuilding, since I don't use it. However, I'm not sure why this builds for everyone except me.

discord commented on 2016-04-16 21:11 (UTC)

I resolved the previous issue by connecting to my maven repository. However, I'm having build issues now:

[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-hive_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-hive_2.11 ---
[WARNING] Failed to build parent project for org.spark-project.hive:hive-cli:jar:1.2.1.spark
[WARNING] Invalid POM for org.spark-project.hive:hive-cli:jar:1.2.1.spark, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Failed to build parent project for org.spark-project.hive:hive-exec:jar:1.2.1.spark
[WARNING] Invalid POM for org.spark-project.hive:hive-exec:jar:1.2.1.spark, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Failed to build parent project for org.spark-project.hive:hive-metastore:jar:1.2.1.spark
[WARNING] Invalid POM for org.spark-project.hive:hive-metastore:jar:1.2.1.spark, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO] Using zinc server for incremental compilation
[info] Compiling 28 Scala sources and 1 Java source to /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/target/scala-2.11/classes...
[warn] Class org.antlr.runtime.tree.CommonTree not found - continuing with a stub.
[warn] Class org.antlr.runtime.Token not found - continuing with a stub.
[warn] Class org.antlr.runtime.tree.Tree not found - continuing with a stub.
[error] Class org.antlr.runtime.tree.CommonTree not found - continuing with a stub.
    [snip: many more "Class org.antlr.runtime.* not found - continuing with a stub" warnings]
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:221: value token is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error] n.token.asInstanceOf[org.antlr.runtime.CommonToken].setText(newText)
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:221: object antlr is not a member of package org
[error] n.token.asInstanceOf[org.antlr.runtime.CommonToken].setText(newText)
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:229: value getChildCount is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error] (1 to n.getChildCount).foreach(_ => n.deleteChild(0))
    [snip: ~100 further errors in HiveQl.scala, mostly "value getText/getType/getChild/... is not a member of org.apache.hadoop.hive.ql.parse.ASTNode"]
[warn] 1778 warnings found
[error] 112 errors found
[error] Compile failed at Apr 16, 2016 4:51:19 AM [3.663s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] Spark Project Parent POM ........................... SUCCESS [03:14 min]
[INFO] Spark Project Test Tags ............................ SUCCESS [01:00 min]
[INFO] Spark Project Launcher ............................. SUCCESS [01:22 min]
[INFO] Spark Project Networking ........................... SUCCESS [ 13.477 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  9.245 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 14.595 s]
[INFO] Spark Project Core ................................. SUCCESS [04:17 min]
[INFO] Spark Project Bagel ................................ SUCCESS [  4.719 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 22.595 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 37.540 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 51.130 s]
[INFO] Spark Project SQL .................................. SUCCESS [01:16 min]
[INFO] Spark Project ML Library ........................... SUCCESS [02:02 min]
[INFO] Spark Project Tools ................................ SUCCESS [  1.967 s]
[INFO] Spark Project Hive ................................. FAILURE [ 55.443 s]
[INFO] Spark Project Docker Integration Tests ............. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:46 min
[INFO] Finished at: 2016-04-16T04:51:19+00:00
[INFO] Final Memory: 58M/529M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-hive_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-hive_2.11
==> ERROR: A failure occurred in build(). Aborting...

discord commented on 2016-04-16 04:13 (UTC)

Could my local .m2 settings be getting in the way?

makepkg -s
==> Making package: apache-spark 1.6.1-1 (Sat Apr 16 04:04:59 UTC 2016)
==> Checking runtime dependencies...
==> Checking buildtime dependencies...
==> Retrieving sources...
  -> Downloading spark-1.6.1.tgz...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 11.6M  100 11.6M    0     0  5375k      0  0:00:02  0:00:02 --:--:-- 5376k
  -> Found apache-spark-standalone.service
  -> Found spark-env.sh
==> Validating source files with md5sums...
    spark-1.6.1.tgz ... Passed
    apache-spark-standalone.service ... Passed
    spark-env.sh ... Passed
==> Extracting sources...
  -> Extracting spark-1.6.1.tgz with bsdtar
==> Starting prepare()...
==> Starting build()...
dev/../assembly/pom.xml
dev/../external/mqtt/pom.xml
dev/../external/zeromq/pom.xml
dev/../external/mqtt-assembly/pom.xml
dev/../external/twitter/pom.xml
dev/../external/flume-sink/pom.xml
dev/../external/kafka-assembly/pom.xml
dev/../external/flume-assembly/pom.xml
dev/../external/kafka/pom.xml
dev/../external/flume/pom.xml
dev/../bagel/pom.xml
dev/../mllib/pom.xml
dev/../launcher/pom.xml
dev/../sql/catalyst/pom.xml
dev/../sql/hive-thriftserver/pom.xml
dev/../sql/hive/pom.xml
dev/../sql/core/pom.xml
dev/../network/common/pom.xml
dev/../network/shuffle/pom.xml
dev/../network/yarn/pom.xml
dev/../graphx/pom.xml
dev/../tags/pom.xml
dev/../examples/pom.xml
dev/../streaming/pom.xml
dev/../tools/pom.xml
dev/../extras/kinesis-asl-assembly/pom.xml
dev/../extras/java8-tests/pom.xml
dev/../extras/kinesis-asl/pom.xml
dev/../extras/spark-ganglia-lgpl/pom.xml
dev/../unsafe/pom.xml
dev/../docker-integration-tests/pom.xml
dev/../dev/audit-release/blank_maven_build/pom.xml
dev/../dev/audit-release/maven_app_core/pom.xml
dev/../yarn/pom.xml
dev/../pom.xml
dev/../repl/pom.xml
dev/../core/pom.xml
dev/../docs/_plugins/copy_api_dirs.rb
+++ dirname ./make-distribution.sh
++ cd .
++ pwd
+ SPARK_HOME=/home/colin/build/AUR/apache-spark/src/spark-1.6.1
+ DISTDIR=/home/colin/build/AUR/apache-spark/src/spark-1.6.1/dist
+ SPARK_TACHYON=false
+ TACHYON_VERSION=0.8.2
+ TACHYON_TGZ=tachyon-0.8.2-bin.tar.gz
+ TACHYON_URL=http://tachyon-project.org/downloads/files/0.8.2/tachyon-0.8.2-bin.tar.gz
+ MAKE_TGZ=false
+ NAME=none
+ MVN=/home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn
+ (( 9 ))
+ case $1 in
+ break
+ '[' -z /usr/lib/jvm/default-runtime ']'
+ '[' -z /usr/lib/jvm/default-runtime ']'
++ command -v git
+ '[' /usr/bin/git ']'
++ git rev-parse --short HEAD
++ :
+ GITREV=
+ '[' '!' -z '' ']'
+ unset GITREV
++ command -v /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn
+ '[' '!' /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn ']'
++ /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn help:evaluate -Dexpression=project.version -Pscala-2.11 -DskipTests -Dmaven.repo.local=/tmp -DautoVersionSubmodules=true -U -Djline.version=2.13 -Djline.groupid=jline -Pyarn -Phadoop-2.6
++ grep -v INFO
++ tail -n 1
+ VERSION='[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/UnresolvableModelException'
==> ERROR: A failure occurred in build(). Aborting...

Knight commented on 2016-03-12 10:25 (UTC)

@huitseeker, sorry for the delay. I don't know why, but oddly the installation succeeded this time...

huitseeker commented on 2016-03-09 08:11 (UTC)

Hi @Knight, the exact nature of the error is above the Maven output you pasted here. Without it, it's difficult to interpret this message.

Knight commented on 2016-03-09 07:12 (UTC)

[INFO] Spark Project Parent POM ........................... SUCCESS [04:12 min]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 58.838 s]
[INFO] Spark Project Launcher ............................. SUCCESS [01:34 min]
[INFO] Spark Project Networking ........................... SUCCESS [ 31.249 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 36.530 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [01:00 min]
[INFO] Spark Project Core ................................. FAILURE [04:08 min]
SKIP...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:05 min
[INFO] Finished at: 2016-03-09T15:04:30+08:00
[INFO] Final Memory: 59M/735M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-core_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]

What should I do?

Aetf commented on 2016-02-23 04:43 (UTC)

The package() function should be:

@@ -55,7 +55,7 @@ package() {
 mkdir -p $pkgdir/etc/profile.d
 echo '#!/bin/sh' > $pkgdir/etc/profile.d/apache-spark.sh
-echo 'SPARK_HOME=$pkgdir/usr/share/apache-spark' >> $pkgdir/etc/profile.d/apache-spark.sh
+echo 'SPARK_HOME=/usr/share/apache-spark' >> $pkgdir/etc/profile.d/apache-spark.sh
 echo 'export SPARK_HOME' >> $pkgdir/etc/profile.d/apache-spark.sh
 chmod 755 $pkgdir/etc/profile.d/apache-spark.sh
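For reference, a minimal sketch of what the installed /etc/profile.d/apache-spark.sh ends up containing once that change is applied (the whole point being that the script must reference the runtime path, never the build-time $pkgdir):

#!/bin/sh
# /etc/profile.d/apache-spark.sh - sourced by login shells
SPARK_HOME=/usr/share/apache-spark
export SPARK_HOME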

aftabnack commented on 2016-02-17 07:35 (UTC) (edited on 2016-02-17 07:38 (UTC) by aftabnack)

My build keeps failing at the point of compiling MQTT; every time it runs out of memory. I'm running 32-bit Arch in VirtualBox with 5 GB RAM. It fails with an OutOfMemory error saying "malloc failed to allocate 171128 bytes for chunk::new". I have run it about 8 times. I tried modifying the PKGBUILD and editing MAVEN_OPTS with different values for -Xmx and MaxPermSize. I can share the hs_err/replay file.
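For anyone trying the same thing, this is the general shape of the MAVEN_OPTS tweak being described here. The values below are only illustrative, not what aftabnack used, and on a 32-bit JVM the heap cannot usefully grow much past about 2 GB anyway:

# export before running makepkg so the Spark Maven build inherits it
# (example values only)
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m"
makepkg -s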

oneeyed commented on 2016-01-07 18:38 (UTC)

Adding two files in /etc/profile.d (sh and csh) to set SPARK_HOME would definitely help users of this package.
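Something like this would do it - a sketch only; the file name and install path are assumptions, not something the package currently ships:

# /etc/profile.d/apache-spark.sh (POSIX shells)
SPARK_HOME=/usr/share/apache-spark
export SPARK_HOME

The csh counterpart, /etc/profile.d/apache-spark.csh, would just contain `setenv SPARK_HOME /usr/share/apache-spark`.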

StefanK2 commented on 2016-01-05 14:46 (UTC) (edited on 2016-01-05 15:09 (UTC) by StefanK2)

LOL, I found the problem: SPARK_HOME wasn't set ;-) Everything is working now.

---

I just updated the package, but now I get this error message when I call spark-submit or spark-shell:

ls: cannot access /usr/assembly/target/scala-2.10: No such file or directory
Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
You need to build Spark before running this program.

According to pacman, scala 2.11 is installed on my machine. Also, when I checked apache-spark/src/spark-1.6.0/assembly/target there is only a scala-2.11 folder. Did I miss some configuration step or something? I changed line 36 in the PKGBUILD from dev/change-version-to-2.11.sh to dev/change-scala-version.sh 2.11 to get rid of the deprecation warning (I assumed my problem was related to the warning, but it didn't change anything). Also, I just found https://issues.apache.org/jira/browse/SPARK-7074 - this seems to be a Spark problem.
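(A note for anyone who lands on the same "Failed to find Spark assembly" message: the /usr/assembly/... path suggests SPARK_HOME was effectively unset and the launcher scripts fell back to the wrong prefix. Setting it by hand is enough - a sketch, assuming the package installs under /usr/share/apache-spark:)

export SPARK_HOME=/usr/share/apache-spark   # adjust if your install path differs
spark-shell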

huitseeker commented on 2015-12-21 14:44 (UTC)

@fosskers I really can't reproduce your issue. Do you have more info on your config? Have you thought of clearing your Maven cache before rebuilding?
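(For reference, "clearing the Maven cache" here just means removing the local repository before rebuilding - a sketch, assuming the default ~/.m2 location:)

rm -rf ~/.m2/repository
makepkg -sCf   # -C removes the old $srcdir before building, -f rebuilds the package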

flying-sheep commented on 2015-11-23 14:16 (UTC)

Yes, please do this.

1. It's compiled from git master, so it should have a -git suffix after the name.
2. The version is wrong (1.6.0 isn't released yet; you'll have to use a version generated by a pkgver() function).
3. The underscore is against convention (unless the name literally has an underscore in it).
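For what it's worth, a minimal sketch of the pkgver() convention being referred to, as it would look in a hypothetical apache-spark-git PKGBUILD (the directory name and format string are assumptions, just the usual VCS-package scheme):

pkgver() {
  cd "$srcdir/spark"
  # revision count plus short commit hash, e.g. r12345.abc1234
  printf "r%s.%s" "$(git rev-list --count HEAD)" "$(git rev-parse --short HEAD)"
}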

fosskers commented on 2015-11-19 20:59 (UTC)

Have spent the last few hours trying to compile this. At the point of building `spark-streaming-mqtt` I always either run out of RAM (I have 4G on my laptop) or I see this:

[INFO] --- maven-assembly-plugin:2.5.5:single (test-jar-with-dependencies) @ spark-streaming-mqtt_2.11 ---
[INFO] Reading assembly descriptor: src/main/assembly/assembly.xml
[WARNING] The assembly descriptor contains a filesystem-root relative reference, which is not cross platform compatible /
[INFO] Building jar: /home/colin/building/apache-spark/src/spark-1.5.2/external/mqtt/target/scala-2.11/spark-streaming-mqtt-test-1.5.2.jar
/home/colin/building/apache-spark/src/spark-1.5.2/build/mvn: line 152: 28401 Killed ${MVN_BIN} -DzincPort=${ZINC_PORT} "$@"

Any ideas?

mister_karim commented on 2015-11-15 10:28 (UTC)

Thanks for the package. Please change the name to apache-spark-git, so that people understand it always builds the latest git version.

hanzeil commented on 2015-10-29 01:58 (UTC) (edited on 2015-10-29 01:59 (UTC) by hanzeil)

This is not working for me; the build fails with an error like this:

++ grep -v INFO
++ tail -n 1
VERSION='[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/UnresolvableModelException'

Why? Thanks.
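(Some context on why the error surfaces in that particular form: judging by the ++/+ trace, make-distribution.sh detects the Spark version by piping mvn help:evaluate through grep and tail, roughly like the paraphrased sketch below. So when Maven aborts with UnresolvableModelException, the last line it prints - the [ERROR]/[Help 2] one - is what ends up stored in VERSION.)

# rough sketch of the version-detection step in make-distribution.sh
VERSION=$("$MVN" help:evaluate -Dexpression=project.version 2>/dev/null \
  | grep -v "INFO" \
  | tail -n 1)
# if mvn fails, its last output line is an [ERROR] message, and that string becomes $VERSION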

michaelgod commented on 2015-09-27 11:36 (UTC)

@wellichen Updated to 1.6.0, thanks for the heads-up!

wellichen commented on 2015-09-24 14:59 (UTC)

This is not working out of the box anymore; I had to edit the PKGBUILD and set pkgver=1.6.0 to get makepkg to work correctly. Please update it as soon as you can. Hugs!

wushuzh commented on 2015-08-17 10:54 (UTC)

The latest hadoop package in the AUR is 2.7.0; should the Spark build command be aligned with it?
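(If someone wants to experiment with that: upstream's build documentation handles newer Hadoop 2.x releases through the hadoop-2.6 profile plus an explicit hadoop.version property, so the change would roughly be the flags below. This is a sketch against the invocation visible in the build logs above, not a tested PKGBUILD change:)

# sketch: point the existing build flags at Hadoop 2.7.0
./make-distribution.sh -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.0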