Package Details: libc++abi 6.0.0-1

Git Clone URL: (read-only)
Package Base: libc++
Description: Low level support for the LLVM C++ standard library.
Upstream URL:
Licenses: MIT, custom:University of Illinois/NCSA Open Source License
Submitter: WoefulDerelict
Maintainer: WoefulDerelict
Last Packager: WoefulDerelict
Votes: 140
Popularity: 21.137911
First Submitted: 2017-02-04 16:09
Last Updated: 2018-03-21 04:20

Pinned Comments

WoefulDerelict commented on 2017-02-05 03:42

This PKGBUILD verifies the authenticity of the source via PGP signatures which are not part of the Arch Linux keyring. To complete the process, it is necessary to import the key(s) from the ‘validpgpkeys’ array into the user’s keyring before calling makepkg. There is a helpful article explaining this process by one of Arch Linux's developers located here:

Instructions on importing keys from a keyserver, and on how to automate the retrieval process, can be found in the Arch Linux wiki here: That article also contains helpful information describing the installation, configuration, and usage of GnuPG.

Execute the following to import keys using gpg:

gpg --recv-keys <KEYID - See 'validpgpkeys' array in PKGBUILD>

The PGP signature check can be skipped by passing --skippgpcheck to makepkg.

The libc++ test suite can be skipped by passing --nocheck to makepkg.

Consult the makepkg manual page for a full list of options.
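Taken together, a typical session might look like the sketch below (the key ID is a placeholder, not the real one; substitute the ID from the PKGBUILD's 'validpgpkeys' array):

```shell
# Import the upstream signing key named in validpgpkeys.
# 0xDEADBEEF is a placeholder key ID, not the real one.
gpg --recv-keys 0xDEADBEEF

# Build and install normally, verifying signatures and running the test suite:
makepkg -si

# Or skip signature verification and/or the libc++ test suite:
makepkg -si --skippgpcheck --nocheck
```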

Latest Comments

xlaits commented on 2018-04-19 22:58

Getting an error trying to build this.

==> Starting build()...
-- The C compiler identification is unknown
-- The CXX compiler identification is unknown
-- The ASM compiler identification is unknown
-- Found assembler: /usr/bin/clang
-- Check for working C compiler: /usr/bin/clang
-- Check for working C compiler: /usr/bin/clang -- broken
CMake Error at /usr/share/cmake-3.10/Modules/CMakeTestCCompiler.cmake:52 (message):
  The C compiler


is not able to compile a simple test program.

It fails with the following output:

Change Dir: /home/xlaits/AUR/libc++/src/build/CMakeFiles/CMakeTmp

Run Build Command:"/usr/bin/ninja" "cmTC_530a5"
[1/2] Building C object CMakeFiles/cmTC_530a5.dir/testCCompiler.c.o
FAILED: CMakeFiles/cmTC_530a5.dir/testCCompiler.c.o 
/usr/bin/clang   -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -fno-plt -o CMakeFiles/cmTC_530a5.dir/testCCompiler.c.o   -c testCCompiler.c
/usr/bin/clang: error while loading shared libraries: cannot open shared object file: No such file or directory
ninja: build stopped: subcommand failed.

CMake will not be able to correctly generate this project.
Call Stack (most recent call first):
  CMakeLists.txt:45 (project)

-- Configuring incomplete, errors occurred!
See also "/home/xlaits/AUR/libc++/src/build/CMakeFiles/CMakeOutput.log".
See also "/home/xlaits/AUR/libc++/src/build/CMakeFiles/CMakeError.log".
==> ERROR: A failure occurred in build(). Aborting...
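The log above elides which shared object clang failed to load, but this class of error can usually be diagnosed as sketched below (assuming an Arch system where clang is provided by the clang and llvm-libs packages):

```shell
# List clang's dynamic dependencies; any "not found" line names
# the shared object the loader could not resolve.
ldd /usr/bin/clang | grep 'not found'

# Find which package owns the binary, then reinstall it together
# with the LLVM runtime libraries to restore the missing .so:
pacman -Qo /usr/bin/clang
sudo pacman -S clang llvm-libs
```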

WoefulDerelict commented on 2018-03-26 18:10

maxlefou: It would take a significant amount of time to run through the test suite with just two threads. How long is it hanging at the test progress bar without changing? You can pass --nocheck to makepkg to skip the tests.

Morganamilo commented on 2018-03-26 18:04

@maxlefou how long is forever?

And you can just pass --nocheck to makepkg.

WoefulDerelict commented on 2018-03-26 17:31

zakimano: Thank you very much for so thoroughly testing the build on your system. The tests will execute in parallel if the resources are available. This generally saves a great deal of time but it will use more memory. It is entirely possible if a number of memory hungry tests ran in parallel it would cause a significant spike in memory usage.

My own tests were run in a VM where the resources were scaled back so fewer tests were able to run in parallel. Decreasing the number of available cores had a direct effect on the memory usage and the amount of time it took for the tests to complete.

Having built the entire LLVM chain and monsters like qt4, qtwebkit and on occasion chromium, I barely notice something like libc++. Yaourt tends to cause the most issues with large, memory hungry builds because of its default configuration. When tracking down issues it is always best to eliminate extraneous variables first and AUR helpers happen to be the easiest target. makepkg and a clean chroot are the best fallback when one encounters build issues.

I'll add a note in the pinned comment about --nocheck for the impatient or resource strapped.

zakimano commented on 2018-03-23 15:11

TL;DR (Mostly for newcomers): Yaourt should be avoided, and if you cannot complete tests on this, turn them off for now, as seen in the comments below.

WoefulDerelict: Thanks for the effort, and for pointing out that yaourt should be avoided - it really should. I re-ran the whole build process with just makepkg; this time the tests could actually complete. However, I still experienced a higher (~2 GiB) peak memory usage than you did, so I'm investigating the issue now...

Getting llvm / libc++ / libc++abi from the provider and following their guide to build and test libc++, I was able to narrow the issue down to the std/input.output/stream.buffers/ group of tests. For further investigation I had to use a tool to kill the process as soon as it crosses the 2 GiB line - and with that I could pinpoint the culprit exactly:


Exact command, according to the build & test guide:

lit -sv test/std/input.output/stream.buffers/streambuf/streambuf.protected/streambuf.put.area/pbump2gig.pass.cpp

I ran it like this:

timeout -m 2100000 lit -sv test/std/input.output/stream.buffers/streambuf/streambuf.protected/streambuf.put.area/pbump2gig.pass.cpp

Which, as its name states, is a test for strings about 2 GiB in size.

Assuming that the timeout tool I used knows what it's doing (it seems to), this test breaks the memory limit (gets shut down by the tool) even when I set that limit to 3 GiB. This behavior might last just a moment - but it was enough to make my weak-ass laptop crash and burn when I tried to build the package with yaourt.

[edit] Valgrind information is irrelevant, as the timeout tool kills the process before it could complete. It seems that I cannot run this test on my own machine with diagnostics tools enabled. [/edit]

As for the exact cause of this issue, I cannot provide more meaningful information without help from someone who understands C++ / Unittests better than me.

Ps.: Sorry for the late (and long) reply, I didn't want to post before I knew and understood as best as I can what was going on.

[edit 2]

Update: If I set the timeout script to a threshold of about 5 GiB, the whole test can run and complete. The output it gives looks like this:

FINISHED CPU 6.83 MEM 333676 MAXMEM 4657000 STALE 6 MAXMEM_RSS 3417208

Where MAXMEM is the maximum amount of memory used - clearly a tad higher than the 2 GiB I was expecting. But I think this peak probably lasts just a few milliseconds.
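As an alternative to a third-party watchdog, the shell's own resource limits can cap the test's address space so a runaway allocation fails fast instead of dragging the whole machine into swap. A sketch, assuming the lit invocation from the build & test guide is run from the libc++ build directory (ulimit -v takes KiB):

```shell
# Run the 2 GiB streambuf test under a virtual-memory cap.
# 3145728 KiB = 3 GiB; the subshell keeps the limit from
# affecting the rest of the session.
(
  ulimit -v 3145728
  lit -sv test/std/input.output/stream.buffers/streambuf/streambuf.protected/streambuf.put.area/pbump2gig.pass.cpp
)
```

With the cap in place an over-allocating test fails with an allocation error rather than freezing the machine.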

WoefulDerelict commented on 2018-03-22 01:25

zakimano: Thanks for the update. If you have a moment to retry the build using makepkg and see if it suffers from the same issue I would appreciate it. My own test system has an abundance of RAM and is based on ZFS so it may mask a spike in memory usage during the tests. I'll fire up a VM when I get a free moment and see if I can't get a clearer picture of test suite's memory usage.

Update: Having completed the build and test suite on a memory restricted virtual machine one can confirm the process neither populates the entirety of a system's RAM and swap space nor does it cause the system to become unresponsive. The VM was configured with 4 cores and 2 GB of RAM and swap. The process ran for about 25 minutes with an average memory footprint of 400 MB. The highest spike I observed was 900 MB. While this might pose an issue on a Raspberry Pi I would wager even those would be able to complete the test suite with a modest amount of swap space.

zakimano commented on 2018-03-21 23:30

Ye, sorry about that.

I too realized the pointlessness of that comment; I'm gonna leave it as is, since I was stupid enough to write it. But I re-tried, just to see if this, from the yaourt manpage:

--tmp <dir>  Use <dir> as temporary folder. Defaults to /tmp

would change anything. And even with that set (yaourt -Sy libc++abi --tmp ~/yaourtbuild/), I get the same behaviour: it just fills up the RAM when performing tests.

The main takeaway here is that I'll stay far away from yaourt from now on, and continue to look into this issue.

WoefulDerelict commented on 2018-03-21 20:55

zakimano: Your problem stems not from the tests but from yaourt and its ridiculous behaviour. yaourt conducts the entire build in /tmp, which is volatile and resides entirely in memory by default. This behaviour is responsible for the issue you encountered, not the libc++ test suite. It would be wise to consult the documentation before bothering maintainers with inane comments.
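A quick way to confirm this and work around it, assuming a stock tmpfs /tmp (the build path below is an example): makepkg honours a BUILDDIR override, so the build can be redirected to a disk-backed directory.

```shell
# Is /tmp RAM-backed? "tmpfs" here means builds in /tmp consume memory.
findmnt -n -o FSTYPE /tmp

# Point makepkg at a disk-backed work area instead (example path;
# BUILDDIR can also be set permanently in makepkg.conf):
mkdir -p "$HOME/build"
BUILDDIR="$HOME/build" makepkg -si
```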

zakimano commented on 2018-03-21 20:20

Thanks for the quick replies.

Running yaourt -Sy libc++abi --m-arg --nocheck did the trick.

As for the necessity of the tests, I agree, it's a good thing to test the packages - but at the same time, I can't agree to tests that are unsafe.

By 'unsafe' I mean that they can very potentially cause a system to hard-freeze, requiring a manual restart, or similar. In my case, I have a laptop with an i5-5200U and 4 GiB of RAM. So it has rather common specs, which makes it a good example of what kind of machines others might use out there. And the tests on this machine filled up all of the available RAM, then started to fill up the swap. I could follow it until it hit the 2 GiB mark into the swap - I don't know how far it got, but I suspect a good 5-6 GiB of memory is required to run the tests, at the very least. Obviously I had to manually restart the machine, as it was completely frozen. This also means that these tests could easily break stuff when run in a production environment, for example.

Now that is something that shouldn't be in a stable-channel package. Either use or write tests that detect system specs and keep themselves at a scale that fits the current system, or simply use tests that operate at a scale no machine today would have problems running.

WoefulDerelict commented on 2018-03-21 16:50

zakimano: To skip the check() function simply pass the --nocheck option to makepkg.

Griever: check() functions are present in many Arch PKGBUILDs and are a recommended practice. Users who don't want to deal with the overhead can disable the tests easily, just as one can skip the process of verifying the source via PGP signatures.

On an i7-870 from the later half of 2009 the process of building and testing libc++ takes just over 15 minutes.
