Closed
Description
Trying to cross-compile the following sample program:
#include <thread>
#include <future>
#include <iostream>

int main() {
    auto future = std::async(std::launch::async, []() {
        std::cout << "I'm a thread" << std::endl;
    });
    future.get();
    return 0;
}
Trying to compile this code using the M68k backend results in an internal compiler error:
glaubitz@node54:/data/home/glaubitz> /data/home/glaubitz/llvm-project/stage1.install/bin/clang -target m68k-linux-gnu future.cc -o future.o -I /data/home/glaubitz/sid-m68k-sbuild/usr/m68k-linux-gnu/include/c++/12/m68k-linux-gnu/ -I /data/home/glaubitz/sid-m68k-sbuild/usr/m68k-linux-gnu/include/ -I /data/home/glaubitz/sid-m68k-sbuild/usr/include/c++/12/ -I /data/home/glaubitz/sid-m68k-sbuild/usr/include/m68k-linux-gnu/c++/12/ -c
Should not custom lower this!
UNREACHABLE executed at /data/home/glaubitz/llvm-project/llvm/lib/Target/M68k/M68kISelLowering.cpp:1353!
PLEASE submit a bug report to https://github.com/llvm/llvm-project/issues/ and include the crash backtrace, preprocessed source, and associated run script.
Stack dump:
0. Program arguments: /data/home/glaubitz/llvm-project/stage1.install/bin/clang -target m68k-linux-gnu future.cc -o future.o -I /data/home/glaubitz/sid-m68k-sbuild/usr/m68k-linux-gnu/include/c++/12/m68k-linux-gnu/ -I /data/home/glaubitz/sid-m68k-sbuild/usr/m68k-linux-gnu/include/ -I /data/home/glaubitz/sid-m68k-sbuild/usr/include/c++/12/ -I /data/home/glaubitz/sid-m68k-sbuild/usr/include/m68k-linux-gnu/c++/12/ -c
1. <eof> parser at end of file
2. Code generation
3. Running pass 'Function Pass Manager' on module 'future.cc'.
4. Running pass 'M68k DAG->DAG Pattern Instruction Selection' on function '@_ZNSt9once_flag18_Prepare_executionC2IZSt9call_onceIMNSt13__future_base13_State_baseV2EFvPSt8functionIFSt10unique_ptrINS3_12_Result_baseENS7_8_DeleterEEvEEPbEJPS4_SC_SD_EEvRS_OT_DpOT0_EUlvE_EERSI_'
#0 0x0000000001f5cb9f PrintStackTraceSignalHandler(void*) Signals.cpp:0:0
#1 0x0000000001f5a754 llvm::sys::CleanupOnSignal(unsigned long) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x1f5a754)
#2 0x0000000001eab210 CrashRecoverySignalHandler(int) CrashRecoveryContext.cpp:0:0
#3 0x00007f0e429e48c0 __restore_rt (/lib64/libpthread.so.0+0x168c0)
#4 0x00007f0e4112acbb raise (/lib64/libc.so.6+0x4acbb)
#5 0x00007f0e4112c355 abort (/lib64/libc.so.6+0x4c355)
#6 0x0000000001eb419a (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x1eb419a)
#7 0x0000000000f002f9 (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0xf002f9)
#8 0x0000000002dfef40 (anonymous namespace)::SelectionDAGLegalize::LegalizeOp(llvm::SDNode*) (.part.412) LegalizeDAG.cpp:0:0
#9 0x0000000002e0c598 llvm::SelectionDAG::Legalize() (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x2e0c598)
#10 0x0000000002ecb861 llvm::SelectionDAGISel::CodeGenAndEmitDAG() (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x2ecb861)
#11 0x0000000002ecf22d llvm::SelectionDAGISel::SelectAllBasicBlocks(llvm::Function const&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x2ecf22d)
#12 0x0000000002ed0d41 llvm::SelectionDAGISel::runOnMachineFunction(llvm::MachineFunction&) (.part.1088) SelectionDAGISel.cpp:0:0
#13 0x0000000001304db2 llvm::MachineFunctionPass::runOnFunction(llvm::Function&) (.part.70) MachineFunctionPass.cpp:0:0
#14 0x000000000185a5c8 llvm::FPPassManager::runOnFunction(llvm::Function&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x185a5c8)
#15 0x000000000185a8e9 llvm::FPPassManager::runOnModule(llvm::Module&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x185a8e9)
#16 0x000000000185b748 llvm::legacy::PassManagerImpl::run(llvm::Module&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x185b748)
#17 0x00000000022c3ef3 clang::EmitBackendOutput(clang::DiagnosticsEngine&, clang::HeaderSearchOptions const&, clang::CodeGenOptions const&, clang::TargetOptions const&, clang::LangOptions const&, llvm::StringRef, llvm::Module*, clang::BackendAction, std::unique_ptr<llvm::raw_pwrite_stream, std::default_delete<llvm::raw_pwrite_stream>>) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x22c3ef3)
#18 0x0000000002fff820 clang::BackendConsumer::HandleTranslationUnit(clang::ASTContext&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x2fff820)
#19 0x0000000003ce5ca9 clang::ParseAST(clang::Sema&, bool, bool) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x3ce5ca9)
#20 0x0000000002ffe530 clang::CodeGenAction::ExecuteAction() (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x2ffe530)
#21 0x00000000029c0d19 clang::FrontendAction::Execute() (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x29c0d19)
#22 0x000000000295b23a clang::CompilerInstance::ExecuteAction(clang::FrontendAction&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x295b23a)
#23 0x0000000002a8e1fb clang::ExecuteCompilerInvocation(clang::CompilerInstance*) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x2a8e1fb)
#24 0x0000000000b30701 cc1_main(llvm::ArrayRef<char const*>, char const*, void*) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0xb30701)
#25 0x0000000000b2bef8 ExecuteCC1Tool(llvm::SmallVectorImpl<char const*>&) driver.cpp:0:0
#26 0x00000000027ecab5 void llvm::function_ref<void ()>::callback_fn<clang::driver::CC1Command::Execute(llvm::ArrayRef<llvm::Optional<llvm::StringRef>>, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>*, bool*) const::'lambda'()>(long) Job.cpp:0:0
#27 0x0000000001eab973 llvm::CrashRecoveryContext::RunSafely(llvm::function_ref<void ()>) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x1eab973)
#28 0x00000000027efcbe clang::driver::CC1Command::Execute(llvm::ArrayRef<llvm::Optional<llvm::StringRef>>, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>*, bool*) const (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x27efcbe)
#29 0x00000000027befd3 clang::driver::Compilation::ExecuteCommand(clang::driver::Command const&, clang::driver::Command const*&, bool) const (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x27befd3)
#30 0x00000000027bfb83 clang::driver::Compilation::ExecuteJobs(clang::driver::JobList const&, llvm::SmallVectorImpl<std::pair<int, clang::driver::Command const*>>&, bool) const (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x27bfb83)
#31 0x00000000027c610c clang::driver::Driver::ExecuteCompilation(clang::driver::Compilation&, llvm::SmallVectorImpl<std::pair<int, clang::driver::Command const*>>&) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0x27c610c)
#32 0x0000000000b2ea2c clang_main(int, char**) (/data/home/glaubitz/llvm-project/stage1.install/bin/clang+0xb2ea2c)
#33 0x00007f0e4111529d __libc_start_main (/lib64/libc.so.6+0x3529d)
#34 0x0000000000b26d8a _start /home/abuild/rpmbuild/BUILD/glibc-2.31/csu/../sysdeps/x86_64/start.S:122:0
clang-16: error: clang frontend command failed with exit code 134 (use -v to see invocation)
clang version 16.0.0 (https://github.com/llvm/llvm-project.git 907baeec49bfbe9e76498634a9418e1dc6c973d9)
Target: m68k-unknown-linux-gnu
Thread model: posix
InstalledDir: /data/home/glaubitz/llvm-project/stage1.install/bin
clang-16: note: diagnostic msg:
********************
PLEASE ATTACH THE FOLLOWING FILES TO THE BUG REPORT:
Preprocessed source(s) and associated run script(s) are located at:
clang-16: note: diagnostic msg: /tmp/future-96ab84.cpp
clang-16: note: diagnostic msg: /tmp/future-96ab84.sh
clang-16: note: diagnostic msg:
********************
glaubitz@node54:/data/home/glaubitz>
Attaching the preprocessed source plus run script in a zipped archive.
llvmbot commented on Nov 23, 2022
@llvm/issue-subscribers-backend-m68k
0x59616e commented on Jan 8, 2023
My initial observation is that we have no support for GlobalTLSAddress. I'm currently looking into this. It would be appreciated if anyone could share their knowledge of TLS support on M68k.
glaubitz commented on Jan 8, 2023
I can only share that TLS support was lacking in glibc on m68k for a long time and was only added later, when ColdFire was developed. I have asked your question on the m68k-specific Debian and Linux kernel mailing lists; there are definitely people there who can answer your question(s).
0x59616e commented on Jan 8, 2023
Appreciated ;)
glaubitz commented on Jan 8, 2023
TLS support for m68k was introduced in gcc in gcc-mirror/gcc@75df395.
Maybe this commit can help with adding TLS support for m68k to LLVM?
glaubitz commented on Jan 8, 2023
And we don't have TLS support in the backend yet (see: https://github.com/llvm/llvm-project/blob/main/llvm/lib/Target/M68k/M68kISelDAGToDAG.cpp#L423).
So, I guess adding TLS support will be the next big goal for the backend.
aslak3 commented on Jan 30, 2023
I appreciate that this horse has bolted, but I'd be interested in anyone's thoughts on the performance of the m68k TLS implementation. In general, you see at least a 50% hit, as documented here:
https://news.ycombinator.com/item?id=30007986
And observed on my own 68030-based homebrew board. I'm compelled to run pre-TLS (2008-ish) Debian with the latest HEAD kernel because of the perf hit caused by TLS. The biggest frustration is that it impacts non-threaded applications (e.g., /bin/uname) as well. It's jolly frustrating that this work towards a more complete 68k Linux userland is only really useful for virtualised "machines" (and even then the perf hit is the same). Unfortunately, I wasn't really following 68k developments back when these choices were made.
Apologies for storming on about something not technically related.
chewi commented on Jan 30, 2023
@aslak3 That's okay, I was wondering the same thing. I know there are no free registers, but I was naively wondering whether repurposing one of them would be less of a performance hit, given that we are looking to break the ABI anyway. I only have a vague understanding of assembly, though, so I can't offer anything useful here.
glaubitz commented on Jan 30, 2023
Well, without TLS, we simply won't be able to compile any modern code, so I don't think we really have a choice.
If we're going to be stuck with old code, there is no point in investing in further development, is there?
Not really. There is faster hardware available these days, such as the Apollo accelerators or the PiStorm.
Yes, it gets slower on older 68k machines, but again, what would be the point of putting more effort into the backend when you can't compile modern code?
FWIW, we have a Discord for m68k where these issues can be discussed.
See: https://discord.gg/x4mrPNFy