dfdx / Spark.jl

Julia binding for Apache Spark

Nothing works #106

Closed JuanVargas closed 2 years ago

JuanVargas commented 2 years ago

I tried to use this library but nothing works. Please remove from GitHub.

dfdx commented 2 years ago

I see several similar comments from you in other repositories with little to no description of the issue and no visible effort to resolve it. This is not a valid problem definition, so I'm closing it. If you actually tried the project and encountered a specific error, feel free to open a new issue with the error message and steps to reproduce. Any new posts without this information will be closed immediately without further notice.

JuanVargas commented 2 years ago

When I wrote "nothing works" I was not trying to be offensive. Sorry if you felt offended. Since you wanted details:

OS: Ubuntu 22.04 LTS; Julia 1.7.3

```
$ java -version
openjdk version "11.0.15" 2022-04-19
OpenJDK Runtime Environment (build 11.0.15+10-Ubuntu-0ubuntu0.22.04.1)
OpenJDK 64-Bit Server VM (build 11.0.15+10-Ubuntu-0ubuntu0.22.04.1, mixed mode, sharing)

$ mvn -version
Apache Maven 3.6.3
Maven home: /usr/share/maven
Java version: 11.0.15, vendor: Private Build, runtime: /usr/lib/jvm/java-11-openjdk-amd64
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "5.15.0-41-generic", arch: "amd64", family: "unix"
```

When I run Julia and try to follow the first 3 lines in your documentation, this is what I get:

```julia
julia> using Pkg; Pkg.add("Spark.jl")
ERROR: Spark.jl is not a valid package name. Perhaps you meant Spark
Stacktrace:
  [1] pkgerror(msg::String)
    @ Pkg.Types ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/Types.jl:68
  [2] check_package_name
    @ ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:128 [inlined]
  [3] add(ctx::Pkg.Types.Context, pkgs::Vector{Pkg.Types.PackageSpec}; preserve::Pkg.Types.PreserveLevel, platform::Base.BinaryPlatforms.Platform, kwargs::Base.Pairs{Symbol, Base.TTY, Tuple{Symbol}, NamedTuple{(:io,), Tuple{Base.TTY}}})
    @ Pkg.API ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:225
  [4] add(pkgs::Vector{Pkg.Types.PackageSpec}; io::Base.TTY, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Pkg.API ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:149
  [5] add(pkgs::Vector{Pkg.Types.PackageSpec})
    @ Pkg.API ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:144
  [6] #add#27
    @ ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:142 [inlined]
  [7] add
    @ ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:142 [inlined]
  [8] #add#26
    @ ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:141 [inlined]
  [9] add(pkg::String)
    @ Pkg.API ~/Downloads/julia-1.7.3/share/julia/stdlib/v1.7/Pkg/src/API.jl:141
 [10] top-level scope
    @ REPL[1]:1
```

```julia
julia> Pkg.add("Spark")
    Updating registry at ~/.julia/registries/General.toml
   Resolving package versions...
  No Changes to ~/.julia/environments/v1.7/Project.toml
  No Changes to ~/.julia/environments/v1.7/Manifest.toml
```

```julia
julia> using Spark
ERROR: InitError: MethodError: no method matching read(::String, ::Type{String})
You may have intended to import Base.read
Stacktrace:
  [1] load_spark_defaults(d::Dict{Any, Any})
    @ Spark ~/.julia/packages/Spark/0luxD/src/init.jl:86
  [2] init(; log_level::String)
    @ Spark ~/.julia/packages/Spark/0luxD/src/init.jl:22
  [3] init
    @ ~/.julia/packages/Spark/0luxD/src/init.jl:17 [inlined]
  [4] init()
    @ Spark ~/.julia/packages/Spark/0luxD/src/core.jl:30
  [5] _include_from_serialized(path::String, depmods::Vector{Any})
    @ Base ./loading.jl:768
  [6] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String)
    @ Base ./loading.jl:854
  [7] _require(pkg::Base.PkgId)
    @ Base ./loading.jl:1097
  [8] require(uuidkey::Base.PkgId)
    @ Base ./loading.jl:1013
  [9] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:997
during initialization of module Spark
```

Given that not even the first 3 lines of your documentation work, I must assume that nothing else works either.

JuanVargas commented 2 years ago

Furthermore:

```julia
julia> Spark.init()
ERROR: MethodError: no method matching read(::String, ::Type{String})
You may have intended to import Base.read
Stacktrace:
 [1] load_spark_defaults(d::Dict{Any, Any})
   @ Spark ~/.julia/packages/Spark/0luxD/src/init.jl:86
 [2] init(; log_level::String)
   @ Spark ~/.julia/packages/Spark/0luxD/src/init.jl:22
 [3] init()
   @ Spark ~/.julia/packages/Spark/0luxD/src/init.jl:17
 [4] top-level scope
   @ REPL[9]:1
```

dfdx commented 2 years ago

This is a much better bug report, thank you. `Pkg.add("Spark.jl")` is certainly a typo in the docs. `Base.read()` is indeed shadowed by a later definition of `Spark.read(::SparkSession)`, but I'm not sure why you encounter it when neither other users nor CI/CD hit it. In any case, both issues are now fixed in the fix-init-and-docs branch.
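For readers unfamiliar with this failure mode: the error above is a general consequence of how Julia resolves names inside a module, not something specific to Spark. The sketch below (module and function names are hypothetical, not the actual Spark.jl source) shows how defining a module-local `read` method shadows `Base.read` for all code in that module, producing exactly the `MethodError` seen in this thread:

```julia
module ShadowDemo

# This helper intends to call Base.read(path, String). The unqualified
# `read` is resolved lazily, at the time the function is first called.
load_defaults(path) = read(path, String)

# Later, the module defines its own `read` method WITHOUT doing
# `import Base: read` first. This creates a new binding ShadowDemo.read
# that shadows Base.read for all code inside the module.
struct Session end
read(s::Session) = "data from session"

end

# ShadowDemo.load_defaults("spark-defaults.conf") now throws:
#   MethodError: no method matching read(::String, ::Type{String})
#   You may have intended to import Base.read
```

The fix applied here follows the same idea as the hint in the error message: either qualify the call as `Base.read(path, String)`, or add `import Base: read` before defining new `read` methods so they extend `Base.read` instead of shadowing it.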

Note that this repository has ~2000 lines of code in ~300 commits, written by 14 people over 7 years. It's reasonable to expect that something still doesn't work, but it certainly doesn't look like nothing works. Analyzing the report, fixing the issues, and running tests on CI/CD took less than 30 minutes after the details were provided.