Package Details: hadoop 3.3.5-2

Git Clone URL: https://aur.archlinux.org/hadoop.git (read-only)
Package Base: hadoop
Description: An open-source software for reliable, scalable, distributed computing
Upstream URL: https://hadoop.apache.org/
Keywords: big-data java map-reduce
Licenses: Apache
Conflicts: yarn
Submitter: sjakub
Maintainer: ttc0419
Last Packager: ttc0419
Votes: 84
Popularity: 0.037510
First Submitted: 2009-04-07 16:39 (UTC)
Last Updated: 2023-06-25 06:58 (UTC)

Dependencies (10)

Required by (3)

Sources (8)

Latest Comments


sourcandy commented on 2023-10-11 15:15 (UTC)

@ttc0419 I see that you maintain both the Hadoop and Spark packages, but they use different directory structures, which drove me insane for the first few days. Would it be possible to centralize everything under /opt/apache-hadoop instead (like most AUR packages do)? Is the current layout kept for historical reasons? It doesn't even follow Hadoop's default directory structure (which looks for logs and config files under /usr) if you run it directly.

I also spotted a small inconsistency between /etc/profile.d/hadoop.sh and /etc/conf.d/hadoop: JAVA_HOME is set in only one of them.

Aside from the Java error, I think /etc/profile.d/hadoop.sh should be modified to something like the following, so behaviour is consistent whether Hadoop is run manually (e.g. as root) or via the hadoop daemons:

export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/lib
export HADOOP_CONF_DIR=/etc/hadoop
export HADOOP_LOG_DIR=/var/log/hadoop
export HADOOP_USERNAME=hadoop

# Consistently use "hadoop" even when running as root or other user
# Make it behave like daemons and prevent creating wrong DFS path (username based)
export HDFS_NAMENODE_USER=hadoop
export HDFS_DATANODE_USER=hadoop
export HDFS_SECONDARYNAMENODE_USER=hadoop
export YARN_NODEMANAGER_USER=hadoop
export YARN_RESOURCEMANAGER_USER=hadoop

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
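
And for the /etc/conf.d/hadoop side, a minimal sketch of what could bring the two files back in line. This assumes the packaged file is a plain shell fragment read by the service units; the variable names simply mirror the profile.d snippet above, and the only real point is that JAVA_HOME should match it:

# /etc/conf.d/hadoop (sketch) -- keep JAVA_HOME identical to /etc/profile.d/hadoop.sh
# so manual runs and daemon runs resolve the same JVM. The packaged file may differ.
HADOOP_USERNAME=hadoop
HADOOP_CONF_DIR=/etc/hadoop
HADOOP_LOG_DIR=/var/log/hadoop
JAVA_HOME=/usr/lib/jvm/java-11-openjdk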

Also, I created this guide in case any of you poor souls hit the same problems I did when setting up Hadoop on Arch (since I couldn't find a good one, and the Arch wiki page for Hadoop is kinda outdated).

lllf commented on 2023-10-09 10:26 (UTC) (edited on 2023-10-09 10:26 (UTC) by lllf)

Had to put this grotesque hack into build() to get it to compile, due to multiple errors:

1. C++17 not being set
2. Linker errors
3. Tests not compiling
4. A Yarn dependency requiring Node 14 when Node 12 is baked in

# Force C++17, link with --copy-dt-needed-entries, and build libhdfs++ only:
sed -i 's#<JVM_ARCH_DATA_MODEL>${sun.arch.data.model}</JVM_ARCH_DATA_MODEL>#<JVM_ARCH_DATA_MODEL>${sun.arch.data.model}</JVM_ARCH_DATA_MODEL><CMAKE_CXX_STANDARD>17</CMAKE_CXX_STANDARD><CMAKE_EXE_LINKER_FLAGS>-Wl,--copy-dt-needed-entries</CMAKE_EXE_LINKER_FLAGS><HDFSPP_LIBRARY_ONLY>1</HDFSPP_LIBRARY_ONLY>#' hadoop-rel-release-${pkgver}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml
# Pass --ignore-engines to the "yarn install" execution so the Node 14 engine check is skipped:
sed -i 's#<id>yarn install</id>#<id>yarn install</id><configuration><arguments>install --ignore-engines</arguments></configuration>#' hadoop-rel-release-${pkgver}/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/pom.xml
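
For context, this is roughly where the hack sits in build(); the mvn line is the upstream-documented native build invocation, not necessarily the exact one this PKGBUILD uses:

build() {
  # The two sed commands above go here, before Maven runs, so the patched
  # pom.xml files are picked up by the native-client and YARN webapp builds.
  cd "hadoop-rel-release-${pkgver}"
  mvn package -Pdist,native -DskipTests -Dtar
}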

lllf commented on 2023-10-09 06:41 (UTC)

No longer building due to:

[WARNING] [ 31%] Built target copy_hadoop_files
[WARNING] make[2]: Leaving directory '/build/hadoop/src/hadoop-rel-release-3.3.5/hadoop-hdfs-project/hadoop-hdfs-native-client/target'
[WARNING] make[2]: Leaving directory '/build/hadoop/src/hadoop-rel-release-3.3.5/hadoop-hdfs-project/hadoop-hdfs-native-client/target'
[WARNING] make[1]: Leaving directory '/build/hadoop/src/hadoop-rel-release-3.3.5/hadoop-hdfs-project/hadoop-hdfs-native-client/target'
[WARNING] In file included from /usr/include/absl/base/config.h:86,
[WARNING]                  from /usr/include/absl/base/attributes.h:37,
[WARNING]                  from /usr/include/absl/strings/string_view.h:39,
[WARNING]                  from /usr/include/google/protobuf/compiler/code_generator.h:45,
[WARNING]                  from /build/hadoop/src/hadoop-rel-release-3.3.5/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto/protoc_gen_hrpc.cc:21:
[WARNING] /usr/include/absl/base/policy_checks.h:79:2: error: #error "C++ versions less than C++14 are not supported."
[WARNING]    79 | #error "C++ versions less than C++14 are not supported."

Adding -std=c++14 to CXXFLAGS doesn't seem to help. Please advise.

gnaggnoyil commented on 2023-06-18 18:29 (UTC)

I'm facing the same issue @dom encountered. I think it's exactly the issue described in https://issues.apache.org/jira/browse/MINIFICPP-2116. As that upstream issue mentions, adding -include cstdint to CXXFLAGS should be a working workaround.
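
If you want to try it from the PKGBUILD side, a minimal sketch (this assumes CXXFLAGS exported in build() is picked up by the cmake-compile step of hadoop-hdfs-native-client):

# Force <cstdint> into every translation unit so the fixed-width integer types
# stay defined with newer GCC/abseil headers.
export CXXFLAGS="${CXXFLAGS} -include cstdint"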

dom commented on 2023-05-16 03:04 (UTC)

Getting the following failure; unsure how to fix it.

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.3.5:cmake-compile (cmake-compile) on project hadoop-hdfs-native-client: make failed with error code 2 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.3.5:cmake-compile (cmake-compile) on project hadoop-hdfs-native-client: make failed with error code 2

Full log here

lllf commented on 2022-10-19 03:00 (UTC)

Can line 31 be changed to

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk

It currently builds with Java 11, which produces binaries that are incompatible with Java 8. Some Apache tools that use Hadoop, such as Hive, do not work with Java 11.
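
For anyone wanting to confirm this locally, a quick check (the jar path below is a guess at where this package installs hadoop-common; adjust as needed); class files built for Java 8 report major version 52, while Java 11 builds report 55:

# Inspect the bytecode version of a class shipped in hadoop-common.
javap -verbose -cp /usr/lib/hadoop/share/hadoop/common/hadoop-common-3.3.5.jar \
    org.apache.hadoop.conf.Configuration | grep 'major version'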

gnaggnoyil commented on 2022-08-27 14:15 (UTC)

@galaxy0419 Confirmed that the package builds successfully after your update. Thanks for your work.

ttc0419 commented on 2022-08-25 07:13 (UTC)

@petronny Thanks for the log; the issue should now be resolved.

petronny commented on 2022-08-24 18:36 (UTC) (edited on 2022-08-24 18:37 (UTC) by petronny)

The new log with -X -e enabled can be downloaded from https://github.com/arch4edu/cactus/actions/runs/2921411343

Also, you can reproduce the error yourself by building with extra-x86_64-build.
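
For reference, the reproduction steps look roughly like this in a clean chroot (assuming devtools is installed):

git clone https://aur.archlinux.org/hadoop.git
cd hadoop
extra-x86_64-build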

ttc0419 commented on 2022-08-24 12:34 (UTC) (edited on 2022-08-24 12:35 (UTC) by ttc0419)

@petronny As the error message says, please use the -X and -e options to generate a more useful debug log; the current log is not very helpful.
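
For example, when invoking Maven by hand (the actual invocation in the PKGBUILD may differ), something like:

# -e prints full stack traces, -X enables debug output; tee the log so it can be shared.
mvn package -Pdist,native -DskipTests -Dtar -e -X 2>&1 | tee hadoop-build-debug.log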