Package Details: apache-spark 3.5.0-1

Git Clone URL: https://aur.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: A unified analytics engine for large-scale data processing
Upstream URL: http://spark.apache.org
Keywords: spark
Licenses: Apache
Submitter: huitseeker
Maintainer: None
Last Packager: ttc0419
Votes: 57
Popularity: 0.019845
First Submitted: 2015-10-04 09:31 (UTC)
Last Updated: 2023-09-29 13:49 (UTC)

Dependencies (2)

Required by (2)

Sources (4)

Latest Comments


mtrokic commented on 2016-08-13 17:20 (UTC)

I think some dependencies are missing. After installing gcc-fortran and postgresql-libs I was able to compile successfully.
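
Concretely, that amounts to something like the following (a sketch, assuming those two packages are the only ones missing on the build machine):

    # install the extra build dependencies, then build and install the package
    sudo pacman -S --needed gcc-fortran postgresql-libs
    makepkg -si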

sidec commented on 2016-07-10 20:16 (UTC) (edited on 2016-07-10 20:19 (UTC) by sidec)

After a successful makepkg -sri I fail to run spark-shell; instead I get this message:

ls: cannot access '/usr/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
You need to build Spark before running this program.
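
The /usr prefix in that path suggests spark-shell is deriving SPARK_HOME from the launcher script's location in /usr/bin rather than from the actual install prefix. A possible check and workaround, assuming the package installs Spark under /opt/apache-spark (the real prefix can be confirmed with pacman -Ql):

    # see where the package actually put the Spark files
    pacman -Ql apache-spark | grep spark-shell
    # point SPARK_HOME at that prefix before launching (adjust if pacman -Ql shows a different one)
    export SPARK_HOME=/opt/apache-spark
    spark-shell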

axelmagn commented on 2016-06-29 19:24 (UTC)

I can confirm discord's issue. I am having the same problem.

huitseeker commented on 2016-05-03 15:40 (UTC)

I have trouble reproducing your issue, sorry.

discord commented on 2016-04-26 18:31 (UTC)

I'm considering removing Hive from the pom build and rebuilding, since I don't use it. However, I'm not sure why the build works for everyone except me.
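
In Spark 1.6, Hive support sits behind the -Phive / -Phive-thriftserver Maven profiles, so one way to try this would be to rebuild the distribution without those profiles. A sketch only: the remaining flags mirror what the make-distribution.sh call shows in the build log further down, and "--name arch" is just a placeholder, not what the PKGBUILD actually uses:

    # build a Spark distribution without the Hive modules
    ./make-distribution.sh --name arch -Pyarn -Phadoop-2.6 -Pscala-2.11 -DskipTests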

discord commented on 2016-04-16 21:11 (UTC)

I resolved the previous issue by connecting to my maven repository. However I'm having build issues now:

[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-hive_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-hive_2.11 ---
[WARNING] Failed to build parent project for org.spark-project.hive:hive-cli:jar:1.2.1.spark
[WARNING] Invalid POM for org.spark-project.hive:hive-cli:jar:1.2.1.spark, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Failed to build parent project for org.spark-project.hive:hive-exec:jar:1.2.1.spark
[WARNING] Invalid POM for org.spark-project.hive:hive-exec:jar:1.2.1.spark, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Failed to build parent project for org.spark-project.hive:hive-metastore:jar:1.2.1.spark
[WARNING] Invalid POM for org.spark-project.hive:hive-metastore:jar:1.2.1.spark, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO] Using zinc server for incremental compilation
[info] Compiling 28 Scala sources and 1 Java source to /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/target/scala-2.11/classes...
[warn] Class org.antlr.runtime.tree.CommonTree not found - continuing with a stub.
[warn] Class org.antlr.runtime.Token not found - continuing with a stub.
[warn] Class org.antlr.runtime.tree.Tree not found - continuing with a stub.
[error] Class org.antlr.runtime.tree.CommonTree not found - continuing with a stub.
[warn] Class org.antlr.runtime.Parser not found - continuing with a stub.
[warn] Class org.antlr.runtime.tree.TreeAdaptor not found - continuing with a stub.
[warn] Class org.antlr.runtime.BitSet not found - continuing with a stub.
[warn] Class org.antlr.runtime.TokenStream not found - continuing with a stub.
[warn] Class org.antlr.runtime.RecognizerSharedState not found - continuing with a stub.
[warn] Class org.antlr.runtime.IntStream not found - continuing with a stub.
[warn] Class org.antlr.runtime.RecognitionException not found - continuing with a stub.
... (the same org.antlr.runtime stub warnings and errors repeat many more times) ...
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:221: value token is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error]     n.token.asInstanceOf[org.antlr.runtime.CommonToken].setText(newText)
[error]       ^
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:221: object antlr is not a member of package org
[error]     n.token.asInstanceOf[org.antlr.runtime.CommonToken].setText(newText)
[error]                              ^
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:229: value getChildCount is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error]     (1 to n.getChildCount).foreach(_ => n.deleteChild(0))
[error]             ^
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:229: value deleteChild is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error]     (1 to n.getChildCount).foreach(_ => n.deleteChild(0))
[error]                                           ^
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:230: value addChildren is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error]     n.addChildren(newChildren.asJava)
[error]       ^
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:246: value getType is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error]     check("type", _.getType)
[error]                     ^
[error] /home/colin/build/AUR/apache-spark/src/spark-1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala:247: value getText is not a member of org.apache.hadoop.hive.ql.parse.ASTNode
[error]     check("text", _.getText)
[error]                     ^
... (dozens of similar "value getText / getType / getChild / getChildCount / getLine / toStringTree is not a member of org.apache.hadoop.hive.ql.parse.ASTNode" errors follow for HiveQl.scala lines 304 through 1714) ...
[warn] 1778 warnings found
[error] 112 errors found
[error] Compile failed at Apr 16, 2016 4:51:19 AM [3.663s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [03:14 min]
[INFO] Spark Project Test Tags ............................ SUCCESS [01:00 min]
[INFO] Spark Project Launcher ............................. SUCCESS [01:22 min]
[INFO] Spark Project Networking ........................... SUCCESS [ 13.477 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  9.245 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 14.595 s]
[INFO] Spark Project Core ................................. SUCCESS [04:17 min]
[INFO] Spark Project Bagel ................................ SUCCESS [  4.719 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 22.595 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 37.540 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 51.130 s]
[INFO] Spark Project SQL .................................. SUCCESS [01:16 min]
[INFO] Spark Project ML Library ........................... SUCCESS [02:02 min]
[INFO] Spark Project Tools ................................ SUCCESS [  1.967 s]
[INFO] Spark Project Hive ................................. FAILURE [ 55.443 s]
[INFO] Spark Project Docker Integration Tests ............. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:46 min
[INFO] Finished at: 2016-04-16T04:51:19+00:00
[INFO] Final Memory: 58M/529M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-hive_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-hive_2.11
==> ERROR: A failure occurred in build(). Aborting...
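
The "Invalid POM for org.spark-project.hive:*" warnings earlier in the log are often a sign of a corrupted or truncated artifact in the local Maven repository, which would leave the antlr classes unresolved. A hedged sketch of forcing those artifacts to be fetched again; the repository path comes from the -Dmaven.repo.local=/tmp setting visible in the build output below (for a default Maven setup it would be ~/.m2/repository instead):

    # drop the cached spark-project hive artifacts so Maven re-downloads them, then rebuild
    rm -rf /tmp/org/spark-project/hive
    makepkg -sf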

discord commented on 2016-04-16 04:13 (UTC)

Could my local .m2 settings be getting in the way?

makepkg -s
==> Making package: apache-spark 1.6.1-1 (Sat Apr 16 04:04:59 UTC 2016)
==> Checking runtime dependencies...
==> Checking buildtime dependencies...
==> Retrieving sources...
  -> Downloading spark-1.6.1.tgz...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 11.6M  100 11.6M    0     0  5375k      0  0:00:02  0:00:02 --:--:-- 5376k
  -> Found apache-spark-standalone.service
  -> Found spark-env.sh
==> Validating source files with md5sums...
    spark-1.6.1.tgz ... Passed
    apache-spark-standalone.service ... Passed
    spark-env.sh ... Passed
==> Extracting sources...
  -> Extracting spark-1.6.1.tgz with bsdtar
==> Starting prepare()...
==> Starting build()...
dev/../assembly/pom.xml
dev/../external/mqtt/pom.xml
dev/../external/zeromq/pom.xml
... (the remaining module pom.xml paths) ...
dev/../repl/pom.xml
dev/../core/pom.xml
dev/../docs/_plugins/copy_api_dirs.rb
+++ dirname ./make-distribution.sh
++ cd .
++ pwd
+ SPARK_HOME=/home/colin/build/AUR/apache-spark/src/spark-1.6.1
+ DISTDIR=/home/colin/build/AUR/apache-spark/src/spark-1.6.1/dist
+ SPARK_TACHYON=false
+ TACHYON_VERSION=0.8.2
+ TACHYON_TGZ=tachyon-0.8.2-bin.tar.gz
+ TACHYON_URL=http://tachyon-project.org/downloads/files/0.8.2/tachyon-0.8.2-bin.tar.gz
+ MAKE_TGZ=false
+ NAME=none
+ MVN=/home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn
+ (( 9 ))
+ case $1 in
+ break
+ '[' -z /usr/lib/jvm/default-runtime ']'
+ '[' -z /usr/lib/jvm/default-runtime ']'
++ command -v git
+ '[' /usr/bin/git ']'
++ git rev-parse --short HEAD
++ :
+ GITREV=
+ '[' '!' -z '' ']'
+ unset GITREV
++ command -v /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn
+ '[' '!' /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn ']'
++ /home/colin/build/AUR/apache-spark/src/spark-1.6.1/build/mvn help:evaluate -Dexpression=project.version -Pscala-2.11 -DskipTests -Dmaven.repo.local=/tmp -DautoVersionSubmodules=true -U -Djline.version=2.13 -Djline.groupid=jline -Pyarn -Phadoop-2.6
++ grep -v INFO
++ tail -n 1
+ VERSION='[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/UnresolvableModelException'
==> ERROR: A failure occurred in build(). Aborting...
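
Since the build already redirects the artifact cache with -Dmaven.repo.local=/tmp (visible in the help:evaluate call above), a per-user ~/.m2/settings.xml would be the most likely piece of local Maven state still in play. A sketch for testing that hypothesis, using only standard Maven file locations:

    # temporarily move any user-level Maven settings aside and rebuild
    [ -f ~/.m2/settings.xml ] && mv ~/.m2/settings.xml ~/.m2/settings.xml.bak
    makepkg -sf
    # restore the settings afterwards
    [ -f ~/.m2/settings.xml.bak ] && mv ~/.m2/settings.xml.bak ~/.m2/settings.xml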

Knight commented on 2016-03-12 10:25 (UTC)

@huitseeker, sorry for the delay. I don't know why, but oddly enough the installation succeeded this time...

huitseeker commented on 2016-03-09 08:11 (UTC)

Hi @Knight, the exact nature of the error appears above the Maven output you pasted here; without it, it's difficult to interpret this message.

Knight commented on 2016-03-09 07:12 (UTC)

[INFO] Spark Project Parent POM ........................... SUCCESS [04:12 min]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 58.838 s]
[INFO] Spark Project Launcher ............................. SUCCESS [01:34 min]
[INFO] Spark Project Networking ........................... SUCCESS [ 31.249 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 36.530 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [01:00 min]
[INFO] Spark Project Core ................................. FAILURE [04:08 min]
SKIP...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:05 min
[INFO] Finished at: 2016-03-09T15:04:30+08:00
[INFO] Final Memory: 59M/735M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-core_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]

What should I do?
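
As huitseeker notes above, the line that actually explains the failure appears before the reactor summary, so it helps to capture the whole build output and search it. A sketch using standard tools:

    # keep the full build log and look for the first compile errors
    makepkg -sf 2>&1 | tee build.log
    grep -n -m 20 '\[error\]' build.log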