shivaramkrs opened this issue 8 years ago
We don't support Windows right now. You could try it through cygwin: run build.sh manually there and see what happens, which might give you a clue. If it is something small at that point then maybe we can fix it, but if it turns into one issue after another then it becomes less likely all the problems will be fixed in the short term.
I investigated supporting Windows a while ago. The issue I couldn't solve was that ParallelAccelerator couldn't load the shared library file.
I am using cygwin and getting the following error:
Maybe you have an old version of g++ that doesn't support the -std=c++11 option. Can you report what "g++ --version" says? I think you need GCC 4.7 or later. You might later get stuck at the same issue that ehsantn did.
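As a generic sketch (not part of ParallelAccelerator), the version requirement can be checked from a shell before trying the build; the 4.7 minimum comes from the -std=c++11 requirement above:

```shell
# Sketch: check whether the g++ on the PATH meets the GCC >= 4.7
# requirement for -std=c++11 support.
meets_minimum() {
    # succeeds if version $1 >= version $2 (GNU sort -V version ordering)
    [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

have=$(g++ -dumpversion 2>/dev/null || echo 0)
if meets_minimum "$have" 4.7; then
    echo "g++ $have is new enough for -std=c++11"
else
    echo "g++ is too old ($have); GCC 4.7 or later is required"
fi
```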
You are right, I had version 4.6.x; I updated it to 5.x. Now I get a new error.
This may be the error ehsantn was stuck at...
No, that's not the error that ehsantn was mentioning. Do you have a file libj2carray.so.1.0? If so, then build.sh worked successfully and you can go on to trying to use ParallelAccelerator. If you get some error about ccall not being able to load a library, or something to that effect, then that is probably ehsantn's error. If you see this problem you'll likely see it at driver.jl:321.
"using ParallelAccelerator" works. I am getting the -std=c++11 error again when I try to run black-scholes.jl.
Maybe I am not setting some path to the right g++ compiler.
If you start a Julia REPL and do run(`g++ --version`) then you should be able to see the default g++ version Julia finds. If that isn't the latest one you installed, then I'd really suggest tinkering with your paths to make sure the new g++ comes first. If you really get stuck, then as a last resort you can add a full path to g++ on lines 2703 and 2800 of cgen.jl.
I executed run(`g++ --version`) in the Julia REPL. I changed the Env path to make the cygwin g++ (version 5) compiler come first.
When I run the code through the REPL, it starts running and after some time Julia exits automatically! How do I get a dump of messages before it exits?
I don't think I've ever seen or heard of this behavior before. The way that we debug the system is to set PROSPECT_DEV_MODE=1 in the environment. You can do that in the REPL with ENV["PROSPECT_DEV_MODE"]=1. Then do something like this:
using ParallelAccelerator
ParallelAccelerator.set_debug_level(3)
Then try the rest of your program and you should get some messages from ParallelAccelerator. Doing this in the REPL makes it harder to capture the output, so you could instead put this in a Julia file, run Julia on that file, and capture the output to a file. Then you can attach the log here.
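The non-interactive capture step can be sketched as below; the script name pi.jl is an assumption about which example you're running (the file itself should also contain the set_debug_level(3) call mentioned above):

```shell
# Sketch: enable ParallelAccelerator's debug output and run the failing
# program non-interactively, capturing stdout and stderr to a log file
# that can be attached to the issue. "pi.jl" is an assumed script name.
export PROSPECT_DEV_MODE=1
julia pi.jl > pa_debug.log 2>&1 || true   # || true: continue even if julia crashes
ls -l pa_debug.log
```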
Here is the dump:
points= 10000 domain code = $(Expr(:lambda, Any[:n], Any[Any[Any[symbol("######rest#3905#8266#8304"),Tuple{},0],Any[symbol("GenSym(7)##2"),Int64,18],Any[symbol("GenSym(4)##1"),Int64,18],Any[symbol("##args#8305"),Tuple{Float64,Int64},0],Any[symbol("##dims#8303"),Tuple{Int64},0],Any[symbol("######rest#3905#8266#8302"),Tuple{},0],Any[:y,Array{Float64,1},18],Any[symbol("##dims#8301"),Tuple{Int64},0],Any[:x,Array{Float64,1},18],Any[:n,Int64,0],Any[symbol("##args#8306"),Tuple{Float64,Int64},0]],Any[],Any[Array{Float64,1},Array{Float64,1},BitArray{1},Int64,Int64,Array{Float64,1},Array{Float64,1},Int64,Array{Float64,1},Array{Float64,1},Array{Float64,1},Array{Float64,1},Array{Float64,1},Array{Int64,1},Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64],Any[]], :(begin # C:\Users\shiva.julia\v0.4\ParallelAccelerator\examples\pi\pi.jl, line 49: GenSym(0) = $(Expr(:alloc, Float64, Any[:(n::Int64)])) GenSym(4) = (Base.arraylen)(GenSym(0))::Int64 GenSym(4)##1 = GenSym(4) GenSym(5) = $(Expr(:mmap!, Any[GenSym(0)], ([:(x1::Float64)];) -> (Any[:(((top(rand!))(Base.Random.GLOBAL_RNG,x1,GenSym(4)##1::Int64,Base.Random.CloseOpen)::Float64,))])::Type[Float64])) GenSym(6) = $(Expr(:mmap, Any[GenSym(5)], ([:(x1::Float64)];) -> (Any[:(((top(mul_float))(x1, 2.0)::Float64,))])::Type[Float64])) x = $(Expr(:mmap, Any[GenSym(6)], ([:(x1::Float64)];) -> (Any[:(((top(sub_float))(x1,1.0)::Float64,))])::Type[Float64])) # C:\Users\shiva.julia\v0.4\ParallelAccelerator\examples\pi\pi.jl, line 50: GenSym(1) = $(Expr(:alloc, Float64, Any[:(n::Int64)])) GenSym(7) = (Base.arraylen)(GenSym(1))::Int64 GenSym(7)##2 = GenSym(7) GenSym(8) = $(Expr(:mmap!, Any[GenSym(1)], ([:(x1::Float64)];) -> (Any[:(((top(rand!))(Base.Random.GLOBAL_RNG,x1,GenSym(7)##2::Int64,Base.Random.CloseOpen)::Float64,))])::Type[Float64])) GenSym(9) = $(Expr(:mmap, Any[GenSym(8)], ([:(x1::Float64)];) -> (Any[:(((top(mul_float))(x1,2.0)::Float64,))])::Type[Float64])) y = $(Expr(:mmap, Any[GenSym(9)], ([:(x1::Float64)];) -> 
(Any[:(((top(sub_float))(x1,1.0)::Float64,))])::Type[Float64])) # C:\Users\shiva.julia\v0.4\ParallelAccelerator\examples\pi\pi.jl, line 51: GenSym(10) = $(Expr(:mmap, Any[:(x::Array{Float64,1})], ([:(x1::Float64)];) -> (Any[:(((top(^))(x1,2)::Float64,))])::Type[Float64])) GenSym(11) = $(Expr(:mmap, Any[:(y::Array{Float64,1})], ([:(x1::Float64)];) -> (Any[:(((top(^))(x1,2)::Float64,))])::Type[Float64])) GenSym(12) = $(Expr(:mmap, Any[GenSym(10),GenSym(11)], ([:(x1::Float64),:(x2::Float64)];) -> (Any[:(((top(add_float))(x1,x2)::Float64,))])::Type[Float64])) GenSym(2) = $(Expr(:mmap, Any[GenSym(12)], ([:(x1::Float64)];) -> (Any[:(((top(lt_float))(x1,1.0)::Bool,))])::Type[Bool])) GenSym(13) = $(Expr(:mmap, Any[GenSym(2)], ([:(x1::Bool)];) -> (Any[:(((top(mul_int))(1,x1)::Int64,))])::Type[Int64])) GenSym(3) = $(Expr(:reduce, 0, GenSym(13), ([:(x1::Int64),:(x2::Int64)];) -> ([:(((top(add_int))(x1,x2)::Int64,))])::Type[Int64])) GenSym(14) = (Core.Intrinsics.sitofp)(Float64,GenSym(3))::Float64 GenSym(15) = (Core.Intrinsics.box)(Float64,GenSym(14))::Float64 GenSym(16) = (Core.Intrinsics.mul_float)(4.0,GenSym(15))::Float64 GenSym(17) = (Core.Intrinsics.sitofp)(Float64,n::Int64)::Float64 GenSym(18) = (Core.Intrinsics.box)(Float64,GenSym(16))::Float64 GenSym(19) = (Core.Intrinsics.box)(Float64,GenSym(17))::Float64 GenSym(20) = (Core.Intrinsics.div_float)(GenSym(18),GenSym(19))::Float64 GenSym(21) = (Core.Intrinsics.box)(Float64,GenSym(20))::Float64 return GenSym(21) end::Float64))) accelerate: DomainIR conversion time = 8.400554217 parallel code = $(Expr(:lambda, Any[:n], 
Any[Any[Any[:parallel_ir_save_array_len_1_1,Int64,18],Any[symbol("GenSym(7)##2"),Int64,18],Any[:parallel_ir_reduction_output_6,Int64,2],Any[symbol("GenSym(4)##1"),Int64,18],Any[symbol("parallel_ir_temp_GenSym(5)_1"),Float64,18],Any[:parallel_ir_temp_parallel_ir_new_array_name_5_1_1,Int64,50],Any[symbol("parallel_ir_temp_GenSym(13)_1"),Int64,18],Any[symbol("parallel_ir_temp_GenSym(1)_2"),Float64,50],Any[symbol("parallel_ir_temp_GenSym(8)_1"),Float64,18],Any[:parallel_ir_temp_y_1,Float64,18],Any[symbol("parallel_ir_temp_GenSym(1)_1"),Float64,18],Any[:parallel_ir_temp_x_1,Float64,18],Any[symbol("parallel_ir_temp_GenSym(8)_2"),Float64,50],Any[symbol("parallel_ir_temp_GenSym(0)_1"),Float64,18],Any[symbol("parallel_ir_temp_GenSym(5)_2"),Float64,50],Any[:n,Int64,0],Any[:parfor_index_1_1,Int64,18],Any[symbol("parallel_ir_temp_GenSym(0)_2"),Float64,50]],Any[],Any[Int64,Int64,Int64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Bool],Any[]], :(begin GenSym(6) = (Core.Intrinsics.sitofp)(Float64,n::Int64)::Float64 GenSym(8) = (Core.Intrinsics.box)(Float64,GenSym(6))::Float64 GenSym(0) = 0 GenSym(4)##1::Int64 = n GenSym(1) = 0 GenSym(7)##2::Int64 = n parallel_ir_save_array_len_1_1::Int64 = n $(Expr(:parfor,
PIR Body: parallel_ir_temp_GenSym(0)_1::Float64 = 0 parallel_ir_temp_GenSym(0)_2::Float64 = (top(rand!))(Base.Random.GLOBAL_RNG,parallel_ir_temp_GenSym(0)_1::Float64,GenSym(4)##1::Int64,Base.Random.CloseOpen)::Float64 parallel_ir_temp_GenSym(5)_1::Float64 = parallel_ir_temp_GenSym(0)_2::Float64 GenSym(11) = (top(mul_float))(parallel_ir_temp_GenSym(5)_1::Float64,2.0)::Float64 parallel_ir_temp_GenSym(5)_2::Float64 = (top(sub_float))(GenSym(11),1.0)::Float64 parallel_ir_temp_GenSym(1)_1::Float64 = 0 parallel_ir_temp_GenSym(1)_2::Float64 = (top(rand!))(Base.Random.GLOBAL_RNG,parallel_ir_temp_GenSym(1)_1::Float64,GenSym(7)##2::Int64,Base.Random.CloseOpen)::Float64 parallel_ir_temp_GenSym(8)_1::Float64 = parallel_ir_temp_GenSym(1)_2::Float64 GenSym(12) = (top(mul_float))(parallel_ir_temp_GenSym(8)_1::Float64,2.0)::Float64 parallel_ir_temp_GenSym(8)_2::Float64 = (top(sub_float))(GenSym(12),1.0)::Float64 parallel_ir_temp_x_1::Float64 = parallel_ir_temp_GenSym(5)_2::Float64 parallel_ir_temp_y_1::Float64 = parallel_ir_temp_GenSym(8)_2::Float64 GenSym(13) = (top(^))(parallel_ir_temp_y_1::Float64,2)::Float64 GenSym(14) = (top(^))(parallel_ir_temp_x_1::Float64,2)::Float64 GenSym(15) = (top(add_float))(GenSym(14),GenSym(13))::Float64 GenSym(16) = (top(lt_float))(GenSym(15),1.0)::Bool parallel_ir_temp_parallel_ir_new_array_name_5_1_1::Int64 = (top(mul_int))(1,GenSym(16))::Int64 parallel_ir_temp_GenSym(13)_1::Int64 = parallel_ir_temp_parallel_ir_new_array_name_5_1_1::Int64 parallel_ir_reduction_output_6::Int64 = (top(add_int))(parallel_ir_reduction_output_6::Int64,parallel_ir_temp_GenSym(13)_1::Int64)::Int64 Loop Nests: ParallelAccelerator.ParallelIR.PIRLoopNest(:(parfor_index_1_1::Int64),1,:(parallel_ir_save_array_len_1_1::Int64),1) Reductions: ParallelAccelerator.ParallelIR.PIRReduction(:(parallel_ir_reduction_output_6::Int64),0,ParallelAccelerator.ParallelIR.DelayedFunc((anonymous function),Any[Any[:(parallel_ir_reduction_output_6::Int64 = 
(top(add_int))(parallel_ir_reduction_output_6::Int64,parallel_ir_temp_GenSym(13)_1::Int64)::Int64)],:(parallel_ir_reduction_output_6::Int64),:(parallel_ir_temp_GenSym(13)_1::Int64)])) Poststatements: 0 )) GenSym(2) = parallel_ir_reduction_output_6::Int64 GenSym(3) = (Core.Intrinsics.sitofp)(Float64,GenSym(2))::Float64 GenSym(4) = (Core.Intrinsics.box)(Float64,GenSym(3))::Float64 GenSym(5) = (Core.Intrinsics.mul_float)(4.0,GenSym(4))::Float64 GenSym(7) = (Core.Intrinsics.box)(Float64,GenSym(5))::Float64 GenSym(9) = (Core.Intrinsics.div_float)(GenSym(7),GenSym(8))::Float64 GenSym(10) = (Core.Intrinsics.box)(Float64,GenSym(9))::Float64 return GenSym(10) end::Float64))) accelerate: ParallelIR conversion time = 13.011465882 flattened code = $(Expr(:lambda, Any[:n], Any[Any[Any[:parallel_ir_save_array_len_1_1,Int64,18],Any[symbol("GenSym(7)##2"),Int64,18],Any[:parallel_ir_reduction_output_6,Int64,2],Any[symbol("GenSym(4)##1"),Int64,18],Any[symbol("parallel_ir_temp_GenSym(5)_1"),Float64,18],Any[symbol("parallel_ir_temp_GenSym(13)_1"),Int64,18],Any[:parallel_ir_temp_parallel_ir_new_array_name_5_1_1,Int64,50],Any[symbol("parallel_ir_temp_GenSym(1)_2"),Float64,50],Any[symbol("parallel_ir_temp_GenSym(8)_1"),Float64,18],Any[:parallel_ir_temp_y_1,Float64,18],Any[symbol("parallel_ir_temp_GenSym(1)_1"),Float64,18],Any[:parallel_ir_temp_x_1,Float64,18],Any[symbol("parallel_ir_temp_GenSym(8)_2"),Float64,50],Any[symbol("parallel_ir_temp_GenSym(0)_1"),Float64,18],Any[symbol("parallel_ir_temp_GenSym(5)_2"),Float64,50],Any[:n,Int64,0],Any[:parfor_index_1_1,Int64,18],Any[symbol("parallel_ir_temp_GenSym(0)_2"),Float64,50]],Any[],Any[Int64,Int64,Int64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Bool],Any[]], :(begin GenSym(6) = (Core.Intrinsics.sitofp)(Float64,n::Int64)::Float64 GenSym(8) = (Core.Intrinsics.box)(Float64,GenSym(6))::Float64 GenSym(0) = 0 GenSym(4)##1::Int64 = n GenSym(1) = 0 GenSym(7)##2::Int64 = n 
parallel_ir_save_array_len_1_1::Int64 = n $(Expr(:parfor_start, ParallelAccelerator.ParallelIR.PIRParForStartEnd([ParallelAccelerator.ParallelIR.PIRLoopNest(:(parfor_index_1_1::Int64),1,:(parallel_ir_save_array_len_1_1::Int64),1)],[ParallelAccelerator.ParallelIR.PIRReduction(:(parallel_ir_reduction_output_6::Int64),0,ParallelAccelerator.ParallelIR.DelayedFunc((anonymous function),Any[Any[:(parallel_ir_reduction_output_6::Int64 = (top(add_int))(parallel_ir_reduction_output_6::Int64,parallel_ir_temp_GenSym(13)_1::Int64)::Int64)],:(parallel_ir_reduction_output_6::Int64),:(parallel_ir_temp_GenSym(13)_1::Int64)]))],nothing,Union{GenSym,Symbol,SymbolNode}[GenSym(11),GenSym(14),symbol("parallel_ir_temp_GenSym(5)_1"),symbol("parallel_ir_temp_GenSym(13)_1"),:parallel_ir_temp_parallel_ir_new_array_name_5_1_1,symbol("parallel_ir_temp_GenSym(1)_2"),symbol("parallel_ir_temp_GenSym(8)_1"),:parallel_ir_temp_y_1,GenSym(16),symbol("parallel_ir_temp_GenSym(1)_1"),:parallel_ir_temp_x_1,GenSym(13),symbol("parallel_ir_temp_GenSym(8)_2"),symbol("parallel_ir_temp_GenSym(0)_1"),symbol("parallel_ir_temp_GenSym(5)_2"),GenSym(15),GenSym(12),symbol("parallel_ir_temp_GenSym(0)_2")]))) parallel_ir_temp_GenSym(0)_1::Float64 = 0 parallel_ir_temp_GenSym(0)_2::Float64 = (top(rand!))(Base.Random.GLOBAL_RNG,parallel_ir_temp_GenSym(0)_1::Float64,GenSym(4)##1::Int64,Base.Random.CloseOpen)::Float64 parallel_ir_temp_GenSym(5)_1::Float64 = parallel_ir_temp_GenSym(0)_2::Float64 GenSym(11) = (top(mul_float))(parallel_ir_temp_GenSym(5)_1::Float64,2.0)::Float64 parallel_ir_temp_GenSym(5)_2::Float64 = (top(sub_float))(GenSym(11),1.0)::Float64 parallel_ir_temp_GenSym(1)_1::Float64 = 0 parallel_ir_temp_GenSym(1)_2::Float64 = (top(rand!))(Base.Random.GLOBAL_RNG,parallel_ir_temp_GenSym(1)_1::Float64,GenSym(7)##2::Int64,Base.Random.CloseOpen)::Float64 parallel_ir_temp_GenSym(8)_1::Float64 = parallel_ir_temp_GenSym(1)_2::Float64 GenSym(12) = (top(mul_float))(parallel_ir_temp_GenSym(8)_1::Float64,2.0)::Float64 
parallel_ir_temp_GenSym(8)_2::Float64 = (top(sub_float))(GenSym(12),1.0)::Float64 parallel_ir_temp_x_1::Float64 = parallel_ir_temp_GenSym(5)_2::Float64 parallel_ir_temp_y_1::Float64 = parallel_ir_temp_GenSym(8)_2::Float64 GenSym(13) = (top(^))(parallel_ir_temp_y_1::Float64,2)::Float64 GenSym(14) = (top(^))(parallel_ir_temp_x_1::Float64,2)::Float64 GenSym(15) = (top(add_float))(GenSym(14),GenSym(13))::Float64 GenSym(16) = (top(lt_float))(GenSym(15),1.0)::Bool parallel_ir_temp_parallel_ir_new_array_name_5_1_1::Int64 = (top(mul_int))(1,GenSym(16))::Int64 parallel_ir_temp_GenSym(13)_1::Int64 = parallel_ir_temp_parallel_ir_new_array_name_5_1_1::Int64 parallel_ir_reduction_output_6::Int64 = (top(add_int))(parallel_ir_reduction_output_6::Int64,parallel_ir_temp_GenSym(13)_1::Int64)::Int64 $(Expr(:parfor_end, ParallelAccelerator.ParallelIR.PIRParForStartEnd([ParallelAccelerator.ParallelIR.PIRLoopNest(:(parfor_index_1_1::Int64),1,:(parallel_ir_save_array_len_1_1::Int64),1)],[ParallelAccelerator.ParallelIR.PIRReduction(:(parallel_ir_reduction_output_6::Int64),0,ParallelAccelerator.ParallelIR.DelayedFunc((anonymous function),Any[Any[:(parallel_ir_reduction_output_6::Int64 = (top(add_int))(parallel_ir_reduction_output_6::Int64,parallel_ir_temp_GenSym(13)_1::Int64)::Int64)],:(parallel_ir_reduction_output_6::Int64),:(parallel_ir_temp_GenSym(13)_1::Int64)]))],nothing,Union{GenSym,Symbol,SymbolNode}[GenSym(11),GenSym(14),symbol("parallel_ir_temp_GenSym(5)_1"),symbol("parallel_ir_temp_GenSym(13)_1"),:parallel_ir_temp_parallel_ir_new_array_name_5_1_1,symbol("parallel_ir_temp_GenSym(1)_2"),symbol("parallel_ir_temp_GenSym(8)_1"),:parallel_ir_temp_y_1,GenSym(16),symbol("parallel_ir_temp_GenSym(1)_1"),:parallel_ir_temp_x_1,GenSym(13),symbol("parallel_ir_temp_GenSym(8)_2"),symbol("parallel_ir_temp_GenSym(0)_1"),symbol("parallel_ir_temp_GenSym(5)_2"),GenSym(15),GenSym(12),symbol("parallel_ir_temp_GenSym(0)_2")]))) GenSym(2) = parallel_ir_reduction_output_6::Int64 GenSym(3) = 
(Core.Intrinsics.sitofp)(Float64,GenSym(2))::Float64 GenSym(4) = (Core.Intrinsics.box)(Float64,GenSym(3))::Float64 GenSym(5) = (Core.Intrinsics.mul_float)(4.0,GenSym(4))::Float64 GenSym(7) = (Core.Intrinsics.box)(Float64,GenSym(5))::Float64 GenSym(9) = (Core.Intrinsics.div_float)(GenSym(7),GenSym(8))::Float64 GenSym(10) = (Core.Intrinsics.box)(Float64,GenSym(9))::Float64 return GenSym(10) end::Float64))) array_types_in_sig from signature = Dict{DataType,Int64}() array_types_in_sig including returns = Dict{DataType,Int64}() ParallelAccelerator.accelerate for _ppcalcPip7907_j2c_proxy C File = C:\Users\shiva.julia\v0.4\ParallelAccelerator\src../deps/generated/cgen_output0.cpp dyn_lib = C:\Users\shiva.julia\v0.4\ParallelAccelerator\src../deps/generated/libcgen_output0.so.1.0 convert_to_ccall_typ typ = Int64 typeof(typ) = DataType convert_to_ccall_typ typ = Int64 typeof(typ) = DataType new_tuple.args = Any[Int64] sig_ndims = Any[0] modified_sig = (Int64,) sig_dims = Any[0] len? 11 signature = (Int64,) -> [(Float64,false)] modified_args = Array{Any,1} Any[symbol("##n#12624")] extra_sig = Type[Ptr{Float64}] ret_arg_exps = Any[:((top(pointer))((top(arrayref))(ret_args::Array{Any,1},1)))] tuple_sig_expr = (Int32,Int64,Ptr{Float64}) accelerate: accelerate conversion time = 3.614398157
Please submit a bug report with steps to reproduce this fault, and any error messages that follow (in their entirety). Thanks. Exception: EXCEPTION_ACCESS_VIOLATION at 0x64f2539e -- jl_egal at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_egal at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) unknown function (ip: 0000000064F047D8) jl_apply_type at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_inst_concrete_tupletype_v at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_init_types at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) typeinf_uncached at inference.jl:1662 jlcall_typeinf_uncached_151 at (unknown line) jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) typeinf at inference.jl:1339 jlcall_typeinf_147 at (unknown line) jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) typeinf_ext at inference.jl:1283 jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_method_cache_insert at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_method_cache_insert at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) toq at util.jl:80 main at C:\Users\shiva.julia\v0.4\ParallelAccelerator\examples\pi\pi.jl:79 jlcall_main_2448 at (unknown line) jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_interpret_toplevel_expr at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_interpret_toplevel_thunk_with at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_eval_with_compiler_p at 
C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_parse_eval_all at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_load_file_string at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) include_string at loading.jl:266 jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_interpret_toplevel_expr at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_interpret_toplevel_thunk_with at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_eval_with_compiler_p at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) jl_f_tuple at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) include_string at C:\Users\shiva.julia\v0.4\CodeTools\src\eval.jl:32 jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) anonymous at C:\Users\shiva.julia\v0.4\Atom\src\eval.jl:84 withpath at C:\Users\shiva.julia\v0.4\Requires\src\require.jl:37 jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) withpath at C:\Users\shiva.julia\v0.4\Atom\src\eval.jl:53 jl_apply_generic at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line) anonymous at C:\Users\shiva.julia\v0.4\Atom\src\eval.jl:83 jl_unprotect_stack at C:\Users\shiva\AppData\Local\Julia-0.4.3\bin\libjulia.dll (unknown line)
Please submit a bug report with steps to reproduce this fault, and any error messages that follow (in their entirety). Thanks. Exception: EXCEPTION_ACCESS_VIOLATION at 0x7ffa853c729b -- RtlVirtualUnwind at C:\WINDOWS\SYSTEM32\ntdll.dll (unknown line)
Please submit a bug report with steps to reproduce this fault, and any error messages that follow (in their entirety). Thanks. Exception: EXCEPTION_ACCESS_VIOLATION at 0x7ffa853c729b -- RtlVirtualUnwind at C:\WINDOWS\SYSTEM32\ntdll.dll (unknown line)
Please submit a bug report with steps to reproduce this fault, and any error messages that follow (in their entirety). Thanks. Exception: EXCEPTION_ACCESS_VIOLATION at 0x7ffa853c729b -- RtlVirtualUnwind at C:\WINDOWS\SYSTEM32\ntdll.dll (unknown line) Julia has stopped: 3221225477, null
I think you may be at the same point ehsantn mentioned.
I tried a cpp file with:
#include <cstdio>

extern "C" int f1(int a, int b) {
    printf("a = %d, b = %d\n", a, b);
    return 7;
}
Then:
g++ -O3 -fopenmp -std=c++11 -g -fpic -c -o f1.o f1.cpp
g++ -g -shared -fopenmp -std=c++11 -o f1.dll f1.o -lm
Then in the Julia REPL:
ret = ccall(("f1","f1.dll"), Cint, (Cint, Cint), 3, 4)
This doesn't work:
ERROR: error compiling anonymous: could not load library "f1.dll" The specified module could not be found.
Then I found issue #6260. So, I ran Dependency Walker on f1.dll and there were four modules that weren't found: CYGGCC_S-1.DLL, CYGGOMP-1.DLL, CYGSTDC++-6.DLL, and CYGWIN1.DLL.
These files exist in the cygwin/bin directory. I added c:\cygwin\bin to my user-specific path variable and re-tried dependency walker and all the modules were then found. However, it still fails in Julia.
I'll ask around about this issue. If you tinker and find a solution to this simple example let me know and we can apply it to the real system.
The problem with my previous example is that you can't compile the DLL with the regular cygwin g++ toolchain, because that produces a DLL that depends on the cygwin runtime. You have to compile the DLL with the mingw version of g++; then the previous example works.
I installed 64-bit mingw compiler through cygwin setup and here's the command.
/bin/x86_64-w64-mingw32-g++ -g -shared -std=c++11 -o f1.dll -lm f1.cpp
I don't have time right now to check if this works with the whole system. You could install mingw, put an alias from g++ to the above executable name in some new directory, put that directory at the head of your path, and then try the system again. An alternative, again, is to change the name of g++ in cgen.jl to point to this new location.
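The alias trick above can be sketched as a one-line wrapper script placed first on the PATH; the cross-compiler path is the cygwin-installed mingw location mentioned earlier and is an assumption about your setup:

```shell
# Sketch: expose the mingw cross compiler under the plain name "g++" by
# putting a wrapper script at the head of the PATH. The compiler path
# /bin/x86_64-w64-mingw32-g++ is an assumed install location.
mkdir -p "$HOME/mingw-shim"
cat > "$HOME/mingw-shim/g++" <<'EOF'
#!/bin/sh
exec /bin/x86_64-w64-mingw32-g++ "$@"
EOF
chmod +x "$HOME/mingw-shim/g++"
export PATH="$HOME/mingw-shim:$PATH"
command -v g++   # should now resolve to the shim
```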
To fix this properly in the real system, maybe we could add a @windows_only section that specifies x86_64-w64-mingw32-g++ as the compiler name; that way it would fail if the correct mingw weren't installed.
A pull request with such a fix would certainly be appreciated.
You should use i686-w64-mingw32-g++ for 32-bit.
The WinRPM package would allow automating the toolchain installation to avoid manual setup steps or having to install cygwin. If you tried that, what issues did it have?
I am not sure if I am right, but I think WinRPM installs g++ 4.6.3 (the one present by default). ParallelAccelerator needs 4.7+.
I get a lot of deprecation warnings on "using WinRPM". Then, I do WinRPM.update() and get:
INFO: Downloading https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml WARNING: Base.Uint16 is deprecated, use UInt16 instead. WARNING: Base.Uint16 is deprecated, use UInt16 instead. WARNING: Base.Uint8 is deprecated, use UInt8 instead. WARNING: Base.Uint8 is deprecated, use UInt8 instead. WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 1/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 2/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 3/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 4/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 5/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml WARNING: received error 0 while downloading https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win32/openSUSE_13.1/repodata/repomd.xml INFO: Downloading https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 1/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, 
error code: 2148270105 WARNING: Retry 2/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 3/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 4/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml WARNING: Unknown download failure, error code: 2148270105 WARNING: Retry 5/5 downloading: https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml WARNING: received error 0 while downloading https://cache.e.ip.saba.us/http://download.opensuse.org/repositories/windows:/mingw:/win64/openSUSE_13.1/repodata/repomd.xml
Then searching for mingw in WinRPM I get the following:
julia> WinRPM.search("mingw")
WinRPM Package Set:
WinRPM fails to download the XML file from behind the firewall; I get this problem in my office. I build it on another machine and then copy the folder over.
You're using an outdated version of WinRPM; are your packages up to date? The gcc version on WinRPM is 5.3.0. You should be able to add it via WinRPM.add("gcc"), though firewalls can cause some issues.
I just ran into this issue on Windows 10. Then I used WinRPM to install the gcc-c++ package.
julia> import WinRPM
julia> WinRPM.install("gcc-c++";yes=true)
INFO: Packages to install: gcc-c++
INFO: Downloading: gcc-c++
INFO: Extracting: gcc-c++
INFO: Complete
julia> WinRPM.select(WinRPM.lookup("gcc-c++"),"gcc-c++")
WinRPM Package:
Name: gcc-c++
Summary: MinGW Windows compiler for C++
Version: 6.1.0 (rel 4.1)
Arch: mingw64
URL: http://www.mingw.org/
License: GPL-2.0+
Description: MinGW Windows compiler for C++
julia> gpp=Pkg.dir("WinRPM","deps","usr","x86_64-w64-mingw32","sys-root","mingw","bin","g++");
julia> run(`$gpp --version`)
g++ (GCC) 6.1.0
Copyright (C) 2016 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
So, it seems WinRPM is providing g++ version 6.1.0. Is there a way to automate the compilation process on Windows by doing everything inside build.jl instead of build.sh? Or otherwise provide instructions on how to compile manually on Windows? Thanks!
For anyone stumbling upon this: I have managed to install ParallelAccelerator on Windows 10. The basic steps are these:
Pkg.test("ParallelAccelerator") will issue a warning that matrix operations will be slow because it cannot find a BLAS installation. I haven't yet managed to fix this.
If desired I could prepare a PR to add this to the readme.
Rewriting build.sh into a build.jl and using the automatic WinRPM installation of gcc would be a better and more automatic short-term solution, though maybe before too long the need for a compiler here will be eliminated. Not sure how far away that is.
I used @s-broda's method and was able to build under Windows 10. (I installed gcc with pacman.) However, when I go to test the package, in addition to the BLAS not found errors, I see something like the following at every test:
Does anyone know what might be going on here?
@s-broda I'd welcome an addition to our documentation on how to get ParallelAccelerator running on Windows, although I suggest adding it to our docs rather than the README.
Regarding BLAS, these are warnings and not errors. Some things will be slower without a BLAS library. Our docs could probably stand to have a better explanation of what a BLAS library is and why it is that ParallelAccelerator wants one. It might also be good to provide a way to suppress the BLAS-related warnings.
@amellnik Do you mean you're seeing these errors when you run Pkg.test("ParallelAccelerator"), or something else?
@lkuper This is when running Pkg.test("ParallelAccelerator") on 0.4.5. Looking more closely at the error log, I think I missed the more important segment:
It looks like the paths to the generated files are being generated correctly here, but are being shelled out without the escapes being removed properly somewhere around here. All the back-tick command constructions there seem correct, but I'll keep looking.
On 0.5.0 master the tests die immediately with
julia> Pkg.test("ParallelAccelerator")
INFO: Testing ParallelAccelerator
WARNING: Base.LambdaStaticData is deprecated, use LambdaInfo instead.
likely near C:\Users\Alex\.julia\v0.5\CompilerTools\src\ast_walk.jl:480
WARNING: Base.LambdaStaticData is deprecated, use LambdaInfo instead.
likely near C:\Users\Alex\.julia\v0.5\CompilerTools\src\ast_walk.jl:481
ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: GenSym not defined
in include_from_node1(::String) at .\loading.jl:426 (repeats 2 times)
in eval(::Module, ::Any) at .\boot.jl:234
in require(::Symbol) at .\loading.jl:357
in include_from_node1(::String) at .\loading.jl:426
in eval(::Module, ::Any) at .\boot.jl:234
in require(::Symbol) at .\loading.jl:357
in include_from_node1(::String) at .\loading.jl:426
in process_options(::Base.JLOptions) at .\client.jl:266
in _start() at .\client.jl:322
while loading C:\Users\Alex\.julia\v0.5\CompilerTools\src\ast_walk.jl, in expression starting on line 423
while loading C:\Users\Alex\.julia\v0.5\CompilerTools\src\CompilerTools.jl, in expression starting on line 32
while loading C:\Users\Alex\.julia\v0.5\ParallelAccelerator\src\ParallelAccelerator.jl, in expression starting on line 30
while loading C:\Users\Alex\.julia\v0.5\ParallelAccelerator\test\runtests.jl, in expression starting on line 26
=========================[ ERROR: ParallelAccelerator ]=========================
failed process: Process(`'C:\Users\Alex\AppData\Local\Julia-0.5.0-dev\bin\julia' -Cx86-64 '-JC:\Users\Alex\AppData\Local\Julia-0.5.0-dev\lib\julia\sys.dll' --compile=yes --depwarn=yes --check-bounds=yes --code-coverage=none --color=yes 'C:\Users\Alex\.julia\v0.5\ParallelAccelerator\test\runtests.jl'`, ProcessExited(1)) [1]
================================================================================
ERROR: ParallelAccelerator had test errors
in #test#51(::Bool, ::Function, ::Array{AbstractString,1}) at .\pkg\entry.jl:720
in (::Base.Pkg.Entry.#kw##test)(::Array{Any,1}, ::Base.Pkg.Entry.#test, ::Array{AbstractString,1}) at .\<missing>:0
in (::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}})() at .\pkg\dir.jl:31
in cd(::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}}, ::String) at .\file.jl:48
in #cd#1(::Array{Any,1}, ::Function, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at .\pkg\dir.jl:31
in (::Base.Pkg.Dir.#kw##cd)(::Array{Any,1}, ::Base.Pkg.Dir.#cd, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at .\<missing>:0
in #test#3(::Bool, ::Function, ::String, ::Vararg{String,N}) at .\pkg\pkg.jl:255
in test(::String, ::Vararg{String,N}) at .\pkg\pkg.jl:255
in eval(::Module, ::Any) at .\boot.jl:234
in macro expansion at .\REPL.jl:92 [inlined]
in (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at .\event.jl:46
Edit: I realized that I was looking at the last tagged release of ParallelAccelerator. On master, tests die immediately with the error above.
@amellnik Yeah, for the moment ParallelAccelerator will only work on 0.4.x versions of Julia, although keep an eye out for 0.5 compatibility once a Julia 0.5 release candidate is available. In the error message you posted, I think the multiple backslashes are a red herring. I'm suspicious of "libgomp.spec: No such file or directory", though. I would suggest searching on that error message and mingw.
@lkuper OK, I've submitted a PR.
@tkelman I agree that this is not a very elegant solution, but I'm afraid I'm not very good with bash scripts. Perhaps someone else could tackle this.
This issue could probably be closed.
I would recommend against using any version of gcc other than the exact version used to compile Julia itself (which is a cygwin cross compile) or the WinRPM packages (which are cross compiled from opensuse).
@s-broda Could your approach still work if using the Julia WinRPM approach to get gcc, as recommended upthread?
@tkelman Why is it important to use the exact same version of gcc that was used to compile Julia?
There are many different variations on how to configure gcc in terms of exception handling, threading, and many other details. When you pick some random gcc build then there's no expectation of maintaining ABI compatibility for trying to load libraries built by the different compilers into the same process, especially if C++ or threading is involved. It's pretty similar to trying to build a library on one linux distribution and use it on a different distro - it can work if you're very careful and follow some strict constraints, but if not it's going to be error prone.
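As a quick sanity check under these constraints, something like the following sketch compares the g++ that Julia picks up from PATH against the WinRPM-provided one (the WinRPM path is illustrative and assumes the gcc-c++ package has been installed):

```julia
# Sketch: show which g++ Julia resolves by default versus the WinRPM one.
# The WinRPM path below is an assumption based on this thread, not a
# guaranteed layout.
default_gpp = readstring(`g++ --version`)   # whatever PATH resolves first
rpm_gpp = Pkg.dir("WinRPM", "deps", "usr", "x86_64-w64-mingw32",
                  "sys-root", "mingw", "bin", "g++")
println("default g++: ", split(default_gpp, '\n')[1])
println("WinRPM g++:  ", split(readstring(`$rpm_gpp --version`), '\n')[1])
```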
Minor update: on a different system running Windows 8.1 with 0.4.5 and the current master, I'm also seeing the same error at using ParallelAccelerator or at the start of tests:
WARNING: could not import Helper.isfunctionhead into DomainIR
LoadError: LoadError: LoadError: UndefVarError: RHSVar not defined
while loading C:\Users\amellnik\.julia\v0.4\ParallelAccelerator\src\domain-ir.jl, in expression starting on line 313
while loading C:\Users\amellnik\.julia\v0.4\ParallelAccelerator\src\ParallelAccelerator.jl, in expression starting on line 250
while loading In[5], in expression starting on line 1
in include at boot.jl:261
in include_from_node1 at loading.jl:320
in include at boot.jl:261
in include_from_node1 at loading.jl:320
in require at loading.jl:259
WARNING: could not import LambdaHandling.lambdaToLambdaVarInfo into DomainIR
I'm currently trying to work out where RHSVar should be defined.
@amellnik RHSVar is exported by the CompilerTools package; are you sure you have that up to date?
@lkuper I had 0.1.5 as a required package, but checking out the master of CompilerTools got me back to a long list of libgomp.spec: No such file or directory errors as above. I then needed to install the optional package "openmp" for TDM-GCC, and now tests are failing with
ERROR: LoadError: LoadError: test error in expression: APILibTest.test1()
error compiling j2c_array_delete: could not load library "C:\Users\amellnik\.jul
ia\v0.4\ParallelAccelerator\src\../deps/libj2carray.so.1.0"
The specified module could not be found.
I do have this file at C:\Users\amellnik\.julia\v0.4\ParallelAccelerator\deps
and this path does resolve correctly.
At this point, I'm not sure I can be much help with resolving these, so I'll wait until there's a build.jl update and try that.
@amellnik Make sure to re-run Pkg.build("ParallelAccelerator") if you haven't done so after updating CompilerTools. Other than that, if you resolved the libgomp.spec issue, I'm not sure what's wrong.
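For reference, the update-then-rebuild sequence being suggested is roughly this, using the Pkg API of Julia 0.4/0.5:

```julia
# Switch CompilerTools from the tagged release to master, then rebuild
# ParallelAccelerator so its deps/build step runs against the new version.
Pkg.checkout("CompilerTools")
Pkg.build("ParallelAccelerator")
```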
@tkelman OK, now that I think about it, using a ParallelAccelerator built on mingw with a Julia binary built for native Windows could present problems. @s-broda, I'm kind of surprised it actually worked...
It does seem like the right way to do Windows compatibility might be to bite the bullet and move everything that build.sh is doing into build.jl to get rid of the bash dependency, build a libj2carray.dll with a native Windows C++ compiler, and have ParallelAccelerator compile a .dll too. Windows development is mostly a mystery to me, though, so I'm not sure.
exported by the CompilerTools package; are you sure you have that up to date?
Sounds like ParallelAccelerator should be more specific about version lower bounds in REQUIRE.
built on mingw with a Julia binary built for native Windows could present problems
Julia binaries for Windows are built with mingw-w64, which is a native compiler. I've been saying that you should be careful about which build of mingw-w64 you use. Using a different compiler would be even more error prone.
I see. I was confusing mingw with cygwin. Like I said, Windows dev is mostly a mystery to me. :)
We could be pegging each ParallelAccelerator release to a particular CompilerTools release, yeah. But how to express the requirement that if you're using ParallelAccelerator master, you should also have CompilerTools master?
I think you should be tagging new releases more often rather than encouraging users to run on master.
I've been working on trying to solve this the right way today. In build.jl I'm installing WinRPM if necessary and then installing gcc through WinRPM. I first got the missing libgomp error that others have seen, and solved it by adding the mingw bin dir to the front of ENV["PATH"]. Now I'm getting the following. Has anyone seen this before?
julia> run(`$gpp -g -shared -std=c++11 -o libj2carray.dll -lm j2c-array.cpp`)
In file included from c:\users\taanders\.julia\v0.5\winrpm\deps\usr\x86_64-w64-mingw32\sys-root\mingw\lib\gcc\x86_64-w64-mingw32\6.1.0\include\c++\cstdlib:75:0,
from c:\users\taanders\.julia\v0.5\winrpm\deps\usr\x86_64-w64-m
ingw32\sys-root\mingw\lib\gcc\x86_64-w64-mingw32\6.1.0\include\c++\stdlib.h:36,
from include/j2c-array.h:31,
from j2c-array.cpp:26:
c:\users\taanders\.julia\v0.5\winrpm\deps\usr\x86_64-w64-mingw32\sys-root\mingw\lib\gcc\x86_64-w64-mingw32\6.1.0\include\c++\stdlib.h:30:26: fatal error: stdlib.h: No such file or directory
# include_next <stdlib.h>
^
compilation terminated.
Yeah, the include directories are a little funny; either add a -I for usr\x86_64-w64-mingw32\sys-root\mingw\include, or make a link or copy from there to usr\x86_64-w64-mingw32\sys-root\mingw\x86_64-w64-mingw32\include
@lkuper You're right; it didn't actually work. I was still using the dll I had built with TDM GCC earlier. My bad. I'm going to close the PR. @DrTodd13 's solution is much more elegant anyway.
@tkelman I have no directory usr\x86_64-w64-mingw32\sys-root\mingw\include. In mingw, I have:
$ ll ../../WinRPM/deps/usr/x86_64-w64-mingw32/sys-root/mingw/
total 12
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:17 .
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:11 ..
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:18 bin
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:17 etc
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:18 lib
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:11 libexec
drwxrwx---+ 1 Administrators Domain Users 0 Jul  6 20:18 share
Odd. Did you do WinRPM.install("gcc-c++")?
@tkelman Yes, I originally did it with WinRPM.install("gcc-c++",yes=true) but now I can say WinRPM.install("gcc-c++") and it says "INFO: Nothing to do".
Looks like they may have moved the includes around, and now they might be in %{_mingw64_libdir}/gcc/%{_mingw64_target}/%{version}/include (ref https://build.opensuse.org/package/view_file/windows:mingw:win64/mingw64-gcc/mingw64-gcc.spec?expand=1)
Go back and look at the error I reported. The problem isn't that it can't find the %{_mingw64_libdir}/gcc/%{_mingw64_target}/%{version}/include directory; that is actually where the error is being reported from, namely the stdlib.h in that directory. It is the "#include_next <stdlib.h>" in that stdlib.h, which can't find a system stdlib.h further down the include path, that seems to be the problem. @tkelman
What about mingw64-headers? That's supposed to be a dependency, but maybe it didn't get auto-installed for some reason?
Oh, gcc evidently isn't a dependency of gcc-c++. If you also run WinRPM.install("gcc", yes=true) you'll get more headers.
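Taken together, the full set of WinRPM installs being suggested at this point would look something like this sketch:

```julia
import WinRPM
# gcc-c++ provides g++ itself; gcc is evidently not pulled in automatically
# as a dependency and supplies additional headers needed for compilation.
WinRPM.install("gcc-c++"; yes=true)
WinRPM.install("gcc"; yes=true)
```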
@tkelman Thanks. WinRPM.install("headers") and then adding a -I for the directory the headers are installed in got me past this problem. Now the error is as follows. If I run the command from a command line in the mingw\bin directory then it works, but if I run it from the directory where the files to be compiled are then it fails with exit code 1, presumably from this CreateProcess problem.
In the build.jl script, I tried cd'ing to the mingw\bin and then running the command but that still gives the CreateProcess error. At the bottom is the relevant section from build.jl. I try to set DL_LOAD_PATH and PATH as well.
g++: error: CreateProcess: No such file or directory
ERROR: LoadError: failed process: Process(`'C:\Users\taanders\.julia\v0.5\WinRPM\deps\usr\x86_64-w64-mingw32\sys-root\mingw\bin\g++' -g -shared -std=c++11 -I 'C:\Users\taanders\.julia\v0.5\WinRPM\deps\usr\x86_64-w64-mingw32\sys-root\mingw\include' -o 'C:\Users\taanders\.julia\v0.5\ParallelAccelerator\deps\libj2carray.dll' -lm 'C:\Users\taanders\.julia\v0.5\ParallelAccelerator\deps\j2c-array.cpp'`,
ProcessExited(1)) [1]
in pipeline_error(::Base.Process) at .\process.jl:612
in run(::Cmd) at .\process.jl:588
in include_from_node1(::String) at .\loading.jl:426
in eval(::Module, ::Any) at .\boot.jl:234
in macro expansion at .\REPL.jl:92 [inlined]
in (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at .\event.jl:46
while loading C:\Users\taanders\.julia\v0.5\ParallelAccelerator\deps\build.jl, in expression starting on line 73
installed_packages = Pkg.installed()
println("Installing ParallelAccelerator for Windows.")
builddir = dirname(Base.source_path())
println("Build directory is ", builddir)
orig_dir = pwd()
if !haskey(installed_packages, "WinRPM")
    println("WinRPM is not currently installed so installing now.")
    Pkg.add("WinRPM")
end
import WinRPM
println("Installing gcc-c++.")
WinRPM.install("gcc-c++";yes=true)
WinRPM.install("headers";yes=true)
gpp = Pkg.dir("WinRPM","deps","usr","x86_64-w64-mingw32","sys-root","mingw","bin","g++")
RPMbindir = Pkg.dir("WinRPM","deps","usr","x86_64-w64-mingw32","sys-root","mingw","bin")
incdir = Pkg.dir("WinRPM","deps","usr","x86_64-w64-mingw32","sys-root","mingw","include")
push!(Base.Libdl.DL_LOAD_PATH,RPMbindir)
ENV["PATH"]=ENV["PATH"]*";"*RPMbindir
cd(RPMbindir)
println("Installed gcc is version ")
run(`$gpp --version`)
run(`$gpp -g -shared -std=c++11 -I $incdir -o $builddir\\libj2carray.dll -lm $builddir\\j2c-array.cpp`)
cd(orig_dir)
What's it trying to run? Maybe -v would give more details?
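Concretely, that would mean re-running the failing compile with -v added (gpp, incdir, and builddir as defined in the build.jl snippet above), so g++ reports each subprogram it tries to spawn, which is presumably where CreateProcess fails:

```julia
# Sketch: same compile as above but with -v, so g++ prints the programs it
# spawns (cc1plus, collect2, ...); the one that fails to launch should show
# up right before the CreateProcess error.
run(`$gpp -v -g -shared -std=c++11 -I $incdir -o $builddir\\libj2carray.dll -lm $builddir\\j2c-array.cpp`)
```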
Installing gcc as well seemed to fix the CreateProcess problem. I updated the repo, and build.jl will now successfully build libj2carray.dll. I am going to interpret this thread as covering Windows support in general and not just install-time errors. build.sh puts some variables into a config file; while most of those aren't relevant on a Windows install, there may be a couple that are, and I don't deal with those yet in build.jl. However, I'll move on to runtime correctness now. Thanks @tkelman !
I get this build error while installing ParallelAccelerator.