OSL support broken in master

Rendering the default scene in master crashes with SIGSEGV when OSL is enabled.

Commit: d268a43b25f7
OS: FreeBSD 12.1
OSL: 1.11.7.3
OIIO: 2.2.6.1
LLVM: 9.0.1
Clang: 9.0.1

Works: 2.90.0 and 2.83.5, both built against the same versions as above.

Process 93323 stopped
* thread #17, name = 'blender', stop reason = signal SIGSEGV: invalid address (fault address: 0x0)
    frame #0: 0x00000008164dad40 libLLVM-9.so`___lldb_unnamed_symbol1929$$libLLVM-9.so + 64
libLLVM-9.so`___lldb_unnamed_symbol1929$$libLLVM-9.so:
->  0x8164dad40 <+64>: movq   (%r12,%rbx), %rsi
    0x8164dad44 <+68>: movq   %r13, %rdi
    0x8164dad47 <+71>: callq  0x818454670               ; symbol stub for: llvm::FoldingSetNodeID::AddPointer(void const*)
    0x8164dad4c <+76>: addq   $0x8, %rbx
(lldb) bt
* thread #17, name = 'blender', stop reason = signal SIGSEGV: invalid address (fault address: 0x0)
  * frame #0: 0x00000008164dad40 libLLVM-9.so`___lldb_unnamed_symbol1929$$libLLVM-9.so + 64
    frame #1: 0x00000008164e6289 libLLVM-9.so`___lldb_unnamed_symbol2054$$libLLVM-9.so + 25
    frame #2: 0x000000081633d5c4 libLLVM-9.so`llvm::FoldingSetBase::FindNodeOrInsertPos(llvm::FoldingSetNodeID const&, void*&) + 180
    frame #3: 0x00000008164da8bd libLLVM-9.so`llvm::PMTopLevelManager::findAnalysisUsage(llvm::Pass*) + 365
    frame #4: 0x00000008164daf6b libLLVM-9.so`llvm::PMTopLevelManager::schedulePass(llvm::Pass*) + 315
    frame #5: 0x0000000817001d06 libLLVM-9.so`llvm::PassManagerBuilder::populateModulePassManager(llvm::legacy::PassManagerBase&) + 486
    frame #6: 0x000000080706ce94 liboslexec.so.1.11`OSL_v1_11::pvt::LLVM_Util::setup_optimization_passes(int, bool) + 740
    frame #7: 0x0000000807049828 liboslexec.so.1.11`OSL_v1_11::pvt::BackendLLVM::initialize_llvm_group() + 136
    frame #8: 0x00000008070635eb liboslexec.so.1.11`OSL_v1_11::pvt::BackendLLVM::run() + 1627
    frame #9: 0x0000000806f2ecb3 liboslexec.so.1.11`OSL_v1_11::pvt::ShadingSystemImpl::optimize_group(OSL_v1_11::ShaderGroup&, OSL_v1_11::ShadingContext*, bool) + 883
    frame #10: 0x0000000806f2cb2f liboslexec.so.1.11`OSL_v1_11::pvt::ShadingSystemImpl::optimize_all_groups(int, int, int, bool) + 799
    frame #11: 0x0000000806f6a674 liboslexec.so.1.11`void* std::__1::__thread_proxy<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct> >, void (*)(OSL_v1_11::pvt::ShadingSystemImpl*, int, int, bool), OSL_v1_11::pvt::ShadingSystemImpl*, int, int, bool> >(void*) + 52
    frame #12: 0x0000000804662735 libthr.so.3`thread_start(curthread=0x000000083bcfb000) at thr_create.c:292:16

I couldn’t reproduce this issue on Linux with LLVM/Clang 9.0.1 and OSL 1.11.7.3.

This seems to be deep in the OSL/LLVM code. I can’t think of any changes on our side that would have caused it; my guess is it’s something related to compiler flags, linking order, or something similarly indirect.
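
One indirect thing that might be worth ruling out (just a guess): whether the Blender binary and liboslexec end up loading two different LLVM copies at runtime, e.g. with something like

    ldd <path to blender binary> | grep -i llvm
    ldd <path to liboslexec.so.1.11> | grep -i llvm

where the paths are placeholders for your local build/install.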

Any chance you can bisect to find the commit that caused this?
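
If you do, a rough sketch of the workflow (the good revision is whatever last built cleanly for you; the v2.90.0 tag name is an assumption):

    git bisect start
    git bisect bad d268a43b25f7    # current master, crashes with OSL enabled
    git bisect good v2.90.0        # last known-good build (tag name assumed)
    # rebuild and render the default scene with OSL at each step, then mark it:
    git bisect good    # or: git bisect bad
    git bisect reset   # when done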

Sorry, it was something at my end. I guess a stale build file.