UIUC-ChenLab / ScaleHLS-HIDA


scalehls regression tests unexpected failures #2

Open tafk7 opened 6 months ago

tafk7 commented 6 months ago

I'm getting two unexpected failures when attempting to build scalehls:

[2368/2369] Running the scalehls regression tests
FAIL: SCALEHLS :: Transforms/Dataflow/place-dataflow-buffer.mlir (45 of 58)
******************** TEST 'SCALEHLS :: Transforms/Dataflow/place-dataflow-buffer.mlir' FAILED ********************
Script:
--
: 'RUN: at line 1';   /home/user/ScaleHLS-HIDA/build/bin/scalehls-opt -scalehls-place-dataflow-buffer /home/user/ScaleHLS-HIDA/test/Transforms/Dataflow/place-dataflow-buffer.mlir | /home/user/ScaleHLS-HIDA/build/bin/FileCheck /home/user/ScaleHLS-HIDA/test/Transforms/Dataflow/place-dataflow-buffer.mlir
--
Exit Code: 1

Command Output (stderr):
--
/home/user/ScaleHLS-HIDA/test/Transforms/Dataflow/place-dataflow-buffer.mlir:4:11: error: CHECK: expected string not found in input
// CHECK: func.func @forward(%arg0: memref<1x64x56x56xi8, #hls.mem<dram>>, %arg1: memref<1000x64xi8, #hls.mem<dram>>, %arg2: memref<64x64x1x1xi8, #hls.mem<dram>>, %arg3: memref<64x64x3x3xi8, #hls.mem<dram>>, %arg4: memref<64x64x3x3xi8, #hls.mem<dram>>, %arg5: memref<1x1000xi8, #hls.mem<dram>>) attributes {top_func} {
          ^
<stdin>:1:57: note: scanning from here
module attributes {torch.debug_module_name = "ResNet"} {
                                                        ^
<stdin>:2:2: note: possible intended match here
 func.func @forward(%arg0: memref<1x64x56x56xi8, #hls.mem<dram>>, %arg1: memref<1000x64xi8, #hls.mem<dram>>, %arg2: memref<64x64x1x1xi8, #hls.mem<dram>>, %arg3: memref<64x64x3x3xi8, #hls.mem<dram>>, %arg4: memref<64x64x3x3xi8, #hls.mem<dram>>, %arg5: memref<1x1000xi8, #hls.mem<bram_t2p>>) attributes {top_func} {
 ^

Input file: <stdin>
Check file: /home/user/ScaleHLS-HIDA/test/Transforms/Dataflow/place-dataflow-buffer.mlir

-dump-input=help explains the following input dump.

Input was:
<<<<<<
           1: module attributes {torch.debug_module_name = "ResNet"} {
check:4'0                                                             X error: no match found
           2:  func.func @forward(%arg0: memref<1x64x56x56xi8, #hls.mem<dram>>, %arg1: memref<1000x64xi8, #hls.mem<dram>>, %arg2: memref<64x64x1x1xi8, #hls.mem<dram>>, %arg3: memref<64x64x3x3xi8, #hls.mem<dram>>, %arg4: memref<64x64x3x3xi8, #hls.mem<dram>>, %arg5: memref<1x1000xi8, #hls.mem<bram_t2p>>) attributes {top_func} {
check:4'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
check:4'1      ? possible intended match
           3:  %c-24_i8 = arith.constant -24 : i8
check:4'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           4:  hls.dataflow.dispatch {
check:4'0     ~~~~~~~~~~~~~~~~~~~~~~~~~
           5:  hls.dataflow.task {
check:4'0     ~~~~~~~~~~~~~~~~~~~~~
           6:  affine.for %arg6 = 0 to 64 {
check:4'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           7:  affine.for %arg7 = 0 to 56 {
check:4'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           .
           .
           .
>>>>>>

--

********************
FAIL: SCALEHLS :: Transforms/Directive/array-partition.mlir (57 of 58)
******************** TEST 'SCALEHLS :: Transforms/Directive/array-partition.mlir' FAILED ********************
Script:
--
: 'RUN: at line 1';   /home/user/ScaleHLS-HIDA/build/bin/scalehls-opt -scalehls-array-partition /home/user/ScaleHLS-HIDA/test/Transforms/Directive/array-partition.mlir | /home/user/ScaleHLS-HIDA/build/bin/FileCheck /home/user/ScaleHLS-HIDA/test/Transforms/Directive/array-partition.mlir
--
Exit Code: 1

Command Output (stderr):
--
/home/user/ScaleHLS-HIDA/test/Transforms/Directive/array-partition.mlir:233:11: error: CHECK: expected string not found in input
// CHECK: func.func @forward_node17(%arg0: memref<16x14x14xi8, #hls.partition<[none, cyclic, cyclic], [1, 2, 2]>, #hls.mem<bram_t2p>>, %arg1: memref<64x28x28xi8, #hls.mem<dram>>, %arg2: index, %arg3: index, %arg4: index) attributes {inline} {
          ^
<stdin>:230:3: note: scanning from here
 }
  ^
<stdin>:231:2: note: possible intended match here
 func.func @forward_node17(%arg0: memref<16x14x14xi8, #hls.partition<[none, cyclic, cyclic], [1, 2, 2]>, #hls.mem<lutram_2p>>, %arg1: memref<64x28x28xi8, #hls.mem<dram>>, %arg2: index, %arg3: index, %arg4: index) attributes {inline} {
 ^

Input file: <stdin>
Check file: /home/user/ScaleHLS-HIDA/test/Transforms/Directive/array-partition.mlir

-dump-input=help explains the following input dump.

Input was:
<<<<<<
             .
             .
             .
           225:  func.call @forward_node10(%9, %arg8, %2, %1, %0) : (memref<16x14x14xi8, #hls.mem<bram_t2p>>, memref<64x28x28xi8, #hls.mem<dram>>, index, index, index) -> ()
           226:  func.call @forward_node9(%4, %arg7, %2, %1, %0) : (memref<16x14x14xi8, #hls.mem<bram_t2p>>, memref<64x28x28xi8, #hls.mem<dram>>, index, index, index) -> ()
           227:  } {loop_directive = #hls.loop<pipeline = false, target_ii = 1, dataflow = true, flatten = false>}
           228:  hls.dataflow.stream_write %arg6, %true : <i1, 1>, i1
           229:  return
           230:  }
check:233'0       X error: no match found
           231:  func.func @forward_node17(%arg0: memref<16x14x14xi8, #hls.partition<[none, cyclic, cyclic], [1, 2, 2]>, #hls.mem<lutram_2p>>, %arg1: memref<64x28x28xi8, #hls.mem<dram>>, %arg2: index, %arg3: index, %arg4: index) attributes {inline} {
check:233'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
check:233'1      ?  possible intended match
           232:  affine.for %arg5 = 0 to 16 {
check:233'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           233:  affine.for %arg6 = 0 to 14 step 2 {
check:233'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           234:  affine.for %arg7 = 0 to 14 step 2 {
check:233'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           235:  %0 = affine.load %arg0[%arg5, %arg6, %arg7] : memref<16x14x14xi8, #hls.partition<[none, cyclic, cyclic], [1, 2, 2]>, #hls.mem<lutram_2p>>
check:233'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
           236:  affine.store %0, %arg1[%arg5 + symbol(%arg2) * 16, %arg6 + symbol(%arg3) * 14, %arg7 + symbol(%arg4) * 14] : memref<64x28x28xi8, #hls.mem<dram>>
check:233'0     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
             .
             .
             .
>>>>>>

--

********************
********************
Failed Tests (2):
  SCALEHLS :: Transforms/Dataflow/place-dataflow-buffer.mlir
  SCALEHLS :: Transforms/Directive/array-partition.mlir

Testing Time: 3.18s
  Unsupported      :  2
  Passed           : 41
  Expectedly Failed: 13
  Failed           :  2
FAILED: tools/scalehls/test/CMakeFiles/check-scalehls /home/user/ScaleHLS-HIDA/build/tools/scalehls/test/CMakeFiles/check-scalehls
cd /home/user/ScaleHLS-HIDA/build/tools/scalehls/test && /usr/bin/python3.10 /home/user/ScaleHLS-HIDA/build/bin/llvm-lit -sv /home/user/ScaleHLS-HIDA/build/tools/scalehls/test
ninja: build stopped: subcommand failed.

Are these expected? Any guidance would be greatly appreciated.
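
For reference, each failing check can be replayed outside of ninja by rerunning its RUN line by hand; FileCheck's -dump-input=always option prints the complete input being matched (the paths and pass flags below are the ones from the log above):

$ cd /home/user/ScaleHLS-HIDA
$ build/bin/scalehls-opt -scalehls-place-dataflow-buffer \
>     test/Transforms/Dataflow/place-dataflow-buffer.mlir \
>     | build/bin/FileCheck test/Transforms/Dataflow/place-dataflow-buffer.mlir \
>     -dump-input=always

In both dumps the "possible intended match" appears to differ from the expected string only in the memory-kind attribute on one argument: #hls.mem<dram> vs. #hls.mem<bram_t2p> on %arg5 in place-dataflow-buffer.mlir, and #hls.mem<bram_t2p> vs. #hls.mem<lutram_2p> on %arg0 in array-partition.mlir.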

tafk7 commented 6 months ago

Following up: when attempting to run scalehls-opt on the resnet18 sample, I get this error:

(torch-mlir) user@system:~/ScaleHLS-HIDA/samples/pytorch/resnet18$ scalehls-opt resnet18.mlir \
>     -scaleflow-pytorch-pipeline="top-func=forward loop-tile-size=8 loop-unroll-factor=4" \
>     | scalehls-translate -scalehls-emit-hlscpp > resnet18.cpp
scalehls-opt: /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/include/llvm/ADT/SmallVector.h:303: reference llvm::SmallVectorTemplateCommon<std::tuple<long, long, bool, long>>::front() [T = std::tuple<long, long, bool, long>]: Assertion `!empty()' failed.
PLEASE submit a bug report to https://github.com/llvm/llvm-project/issues/ and include the crash backtrace.
Stack dump:
0.      Program arguments: scalehls-opt resnet18.mlir "-scaleflow-pytorch-pipeline=top-func=forward loop-tile-size=8 loop-unroll-factor=4"
 #0 0x0000555c9d1b61ea llvm::sys::PrintStackTrace(llvm::raw_ostream&, int) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/lib/Support/Unix/Signals.inc:569:11
 #1 0x0000555c9d1b639b PrintStackTraceSignalHandler(void*) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/lib/Support/Unix/Signals.inc:636:1
 #2 0x0000555c9d1b4a16 llvm::sys::RunSignalHandlers() /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/lib/Support/Signals.cpp:104:5
 #3 0x0000555c9d1b6a85 SignalHandler(int) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/lib/Support/Unix/Signals.inc:407:1
 #4 0x00007f7256242520 (/lib/x86_64-linux-gnu/libc.so.6+0x42520)
 #5 0x00007f72562969fc pthread_kill (/lib/x86_64-linux-gnu/libc.so.6+0x969fc)
 #6 0x00007f7256242476 gsignal (/lib/x86_64-linux-gnu/libc.so.6+0x42476)
 #7 0x00007f72562287f3 abort (/lib/x86_64-linux-gnu/libc.so.6+0x287f3)
 #8 0x00007f725622871b (/lib/x86_64-linux-gnu/libc.so.6+0x2871b)
 #9 0x00007f7256239e96 (/lib/x86_64-linux-gnu/libc.so.6+0x39e96)
#10 0x0000555c9c104c7c llvm::SmallVectorTemplateCommon<std::tuple<long, long, bool, long>, void>::front() /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/include/llvm/ADT/SmallVector.h:0:5
#11 0x0000555c9c102632 getBufferIndexDepthsAndStrides(mlir::scalehls::hls::NodeOp, mlir::Value) /home/user/ScaleHLS-HIDA/lib/Dialect/HLS/Analysis.cpp:173:45
#12 0x0000555c9c101924 mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0::operator()(mlir::scalehls::hls::BufferLikeInterface) const /home/user/ScaleHLS-HIDA/lib/Dialect/HLS/Analysis.cpp:221:15
#13 0x0000555c9c1016a3 std::enable_if<!llvm::is_one_of<mlir::scalehls::hls::BufferLikeInterface, mlir::Operation*, mlir::Region*, mlir::Block*>::value && std::is_same<mlir::WalkResult, mlir::WalkResult>::value, mlir::WalkResult>::type mlir::detail::walk<(mlir::WalkOrder)1, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0, mlir::scalehls::hls::BufferLikeInterface, mlir::WalkResult>(mlir::Operation*, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0&&)::'lambda'(mlir::Operation*)::operator()(mlir::Operation*) const /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/../mlir/include/mlir/IR/Visitors.h:230:14
#14 0x0000555c9c10162d mlir::WalkResult llvm::function_ref<mlir::WalkResult (mlir::Operation*)>::callback_fn<std::enable_if<!llvm::is_one_of<mlir::scalehls::hls::BufferLikeInterface, mlir::Operation*, mlir::Region*, mlir::Block*>::value && std::is_same<mlir::WalkResult, mlir::WalkResult>::value, mlir::WalkResult>::type mlir::detail::walk<(mlir::WalkOrder)1, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0, mlir::scalehls::hls::BufferLikeInterface, mlir::WalkResult>(mlir::Operation*, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0&&)::'lambda'(mlir::Operation*)>(long, mlir::Operation*) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/include/llvm/ADT/STLFunctionalExtras.h:45:12
#15 0x0000555c9d01e1d1 llvm::function_ref<mlir::WalkResult (mlir::Operation*)>::operator()(mlir::Operation*) const /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/include/llvm/ADT/STLFunctionalExtras.h:68:12
#16 0x0000555c9d01de20 mlir::detail::walk(mlir::Operation*, llvm::function_ref<mlir::WalkResult (mlir::Operation*)>, mlir::WalkOrder) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/IR/Visitors.cpp:181:12
#17 0x0000555c9d01dd9e mlir::detail::walk(mlir::Operation*, llvm::function_ref<mlir::WalkResult (mlir::Operation*)>, mlir::WalkOrder) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/IR/Visitors.cpp:174:13
#18 0x0000555c9d01dd9e mlir::detail::walk(mlir::Operation*, llvm::function_ref<mlir::WalkResult (mlir::Operation*)>, mlir::WalkOrder) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/IR/Visitors.cpp:174:13
#19 0x0000555c9c1015ca std::enable_if<!llvm::is_one_of<mlir::scalehls::hls::BufferLikeInterface, mlir::Operation*, mlir::Region*, mlir::Block*>::value && std::is_same<mlir::WalkResult, mlir::WalkResult>::value, mlir::WalkResult>::type mlir::detail::walk<(mlir::WalkOrder)1, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0, mlir::scalehls::hls::BufferLikeInterface, mlir::WalkResult>(mlir::Operation*, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0&&) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/../mlir/include/mlir/IR/Visitors.h:233:10
#20 0x0000555c9c10156d std::enable_if<llvm::function_traits<std::decay<mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0>::type>::num_args == 1, mlir::WalkResult>::type mlir::Operation::walk<(mlir::WalkOrder)1, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0, mlir::WalkResult>(mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0&&) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/../mlir/include/mlir/IR/Operation.h:575:12
#21 0x0000555c9c1012f0 std::enable_if<llvm::function_traits<std::decay<mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0>::type>::num_args == 1, mlir::WalkResult>::type mlir::OpState::walk<(mlir::WalkOrder)1, mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0, mlir::WalkResult>(mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp)::$_0&&) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/../mlir/include/mlir/IR/OpDefinition.h:153:19
#22 0x0000555c9c1012c4 mlir::scalehls::CorrelationAnalysis::CorrelationAnalysis(mlir::func::FuncOp) /home/user/ScaleHLS-HIDA/lib/Dialect/HLS/Analysis.cpp:206:8
#23 0x0000555c9c0f2b4c (anonymous namespace)::ParallelizeDataflowNode::applyCorrelationAwareUnroll(mlir::func::FuncOp) /home/user/ScaleHLS-HIDA/lib/Transforms/Dataflow/ParallelizeDataflowNode.cpp:198:46
#24 0x0000555c9c0f2707 (anonymous namespace)::ParallelizeDataflowNode::runOnOperation() /home/user/ScaleHLS-HIDA/lib/Transforms/Dataflow/ParallelizeDataflowNode.cpp:325:7
#25 0x0000555c9cd17dca mlir::detail::OpToOpPassAdaptor::run(mlir::Pass*, mlir::Operation*, mlir::AnalysisManager, bool, unsigned int) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:462:21
#26 0x0000555c9cd183a4 mlir::detail::OpToOpPassAdaptor::runPipeline(mlir::OpPassManager&, mlir::Operation*, mlir::AnalysisManager, bool, unsigned int, mlir::PassInstrumentor*, mlir::PassInstrumentation::PipelineParentInfo const*) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:525:16
#27 0x0000555c9cd1d4a5 mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::$_0::operator()(mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo&) const /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:745:36
#28 0x0000555c9cd1d129 mlir::LogicalResult mlir::failableParallelForEach<__gnu_cxx::__normal_iterator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo*, std::vector<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo, std::allocator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo>>>, mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::$_0&>(mlir::MLIRContext*, __gnu_cxx::__normal_iterator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo*, std::vector<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo, std::allocator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo>>>, __gnu_cxx::__normal_iterator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo*, std::vector<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo, std::allocator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo>>>, mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::$_0&) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/include/mlir/IR/Threading.h:46:18
#29 0x0000555c9cd195eb mlir::LogicalResult mlir::failableParallelForEach<std::vector<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo, std::allocator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo>>&, mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::$_0&>(mlir::MLIRContext*, std::vector<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo, std::allocator<mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::OpPMInfo>>&, mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool)::$_0&) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/include/mlir/IR/Threading.h:92:10
#30 0x0000555c9cd18ef8 mlir::detail::OpToOpPassAdaptor::runOnOperationAsyncImpl(bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:755:14
#31 0x0000555c9cd18057 mlir::detail::OpToOpPassAdaptor::runOnOperation(bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:646:5
#32 0x0000555c9cd17dbb mlir::detail::OpToOpPassAdaptor::run(mlir::Pass*, mlir::Operation*, mlir::AnalysisManager, bool, unsigned int) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:459:5
#33 0x0000555c9cd183a4 mlir::detail::OpToOpPassAdaptor::runPipeline(mlir::OpPassManager&, mlir::Operation*, mlir::AnalysisManager, bool, unsigned int, mlir::PassInstrumentor*, mlir::PassInstrumentation::PipelineParentInfo const*) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:525:16
#34 0x0000555c9cd19bb8 mlir::PassManager::runPasses(mlir::Operation*, mlir::AnalysisManager) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:828:10
#35 0x0000555c9cd19aef mlir::PassManager::run(mlir::Operation*) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Pass/Pass.cpp:808:60
#36 0x0000555c9bfcdff2 performActions(llvm::raw_ostream&, bool, bool, llvm::SourceMgr&, mlir::MLIRContext*, llvm::function_ref<mlir::LogicalResult (mlir::PassManager&)>, bool, bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Tools/mlir-opt/MlirOptMain.cpp:91:17
#37 0x0000555c9bfcdc58 processBuffer(llvm::raw_ostream&, std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, bool, bool, bool, bool, bool, bool, llvm::function_ref<mlir::LogicalResult (mlir::PassManager&)>, mlir::DialectRegistry&, llvm::ThreadPool*) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Tools/mlir-opt/MlirOptMain.cpp:139:12
#38 0x0000555c9bfcda39 mlir::MlirOptMain(llvm::raw_ostream&, std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::function_ref<mlir::LogicalResult (mlir::PassManager&)>, mlir::DialectRegistry&, bool, bool, bool, bool, bool, bool, bool)::$_0::operator()(std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::raw_ostream&) const /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Tools/mlir-opt/MlirOptMain.cpp:181:12
#39 0x0000555c9bfcd946 mlir::LogicalResult llvm::function_ref<mlir::LogicalResult (std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::raw_ostream&)>::callback_fn<mlir::MlirOptMain(llvm::raw_ostream&, std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::function_ref<mlir::LogicalResult (mlir::PassManager&)>, mlir::DialectRegistry&, bool, bool, bool, bool, bool, bool, bool)::$_0>(long, std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::raw_ostream&) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/include/llvm/ADT/STLFunctionalExtras.h:45:12
#40 0x0000555c9d0406f2 llvm::function_ref<mlir::LogicalResult (std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::raw_ostream&)>::operator()(std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::raw_ostream&) const /home/user/ScaleHLS-HIDA/polygeist/llvm-project/llvm/include/llvm/ADT/STLFunctionalExtras.h:68:12
#41 0x0000555c9d03fd0d mlir::splitAndProcessBuffer(std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::function_ref<mlir::LogicalResult (std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::raw_ostream&)>, llvm::raw_ostream&, bool, bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Support/ToolUtilities.cpp:28:12
#42 0x0000555c9bfcc9ca mlir::MlirOptMain(llvm::raw_ostream&, std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, llvm::function_ref<mlir::LogicalResult (mlir::PassManager&)>, mlir::DialectRegistry&, bool, bool, bool, bool, bool, bool, bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Tools/mlir-opt/MlirOptMain.cpp:186:10
#43 0x0000555c9bfccb63 mlir::MlirOptMain(llvm::raw_ostream&, std::unique_ptr<llvm::MemoryBuffer, std::default_delete<llvm::MemoryBuffer>>, mlir::PassPipelineCLParser const&, mlir::DialectRegistry&, bool, bool, bool, bool, bool, bool, bool, bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Tools/mlir-opt/MlirOptMain.cpp:209:10
#44 0x0000555c9bfcd76f mlir::MlirOptMain(int, char**, llvm::StringRef, mlir::DialectRegistry&, bool) /home/user/ScaleHLS-HIDA/polygeist/llvm-project/mlir/lib/Tools/mlir-opt/MlirOptMain.cpp:306:14
#45 0x0000555c99b82dbc main /home/user/ScaleHLS-HIDA/tools/scalehls-opt/scalehls-opt.cpp:16:23
#46 0x00007f7256229d90 (/lib/x86_64-linux-gnu/libc.so.6+0x29d90)
#47 0x00007f7256229e40 __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x29e40)
#48 0x0000555c99b82c85 _start (/home/user/ScaleHLS-HIDA/build/bin/scalehls-opt+0x1d20c85)
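
One way to narrow this down (a sketch, not verified here): the stack trace goes through mlir::MlirOptMain, so scalehls-opt should accept the standard MLIR pass-manager options. Dropping the scalehls-translate pipe and asking for a crash reproducer captures the input IR together with the pipeline that was running, which is easier to attach to a bug report than the full log (the reproducer filename is arbitrary):

$ cd ~/ScaleHLS-HIDA/samples/pytorch/resnet18
$ scalehls-opt resnet18.mlir \
>     -scaleflow-pytorch-pipeline="top-func=forward loop-tile-size=8 loop-unroll-factor=4" \
>     -mlir-pass-pipeline-crash-reproducer=resnet18.crash.mlir \
>     -o /dev/null

Per the backtrace, the assertion is SmallVector::front() called on an empty vector inside getBufferIndexDepthsAndStrides (lib/Dialect/HLS/Analysis.cpp:173), reached from CorrelationAnalysis during the ParallelizeDataflowNode pass.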
john-kan-2 commented 3 months ago

I got through the whole build (after quite a number of failures), but it now seems that I likely need to use a Python version different from 3.11. Once I had built everything, installed torch-mlir in the venv, etc., I ran into a segmentation fault after running this command:

$ python3 resnet18.py > resnet18.mlir

I was using Python 3.11, and everything seemed to go well until this last step. I'm going to try Python 3.8 tomorrow. Note that I am unaffiliated with scalehls.

Also: according to this GitHub comment, torch-mlir can only be used with either Python 3.8 or Python 3.11. I found this to be true (I compiled everything with 3.10, and it ended up complaining when I tried to install torch-mlir with pip).
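
A minimal sketch of pinning the interpreter for that last step, assuming the prebuilt torch-mlir wheels are only published for the Python versions mentioned above (the venv name here is illustrative, and the exact pip install command should be taken from the torch-mlir release instructions):

$ python3.8 -m venv ~/torch-mlir-venv        # or python3.11
$ source ~/torch-mlir-venv/bin/activate
$ python -m pip install --upgrade pip
$ # ... install torch-mlir and torchvision here per the torch-mlir docs ...
$ cd ~/ScaleHLS-HIDA/samples/pytorch/resnet18
$ python3 resnet18.py > resnet18.mlir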