cdarlint / winutils

winutils.exe hadoop.dll and hdfs.dll binaries for hadoop windows

This version of %1 is not compatible with the version of Windows you're running #20

Open molotch opened 3 years ago

molotch commented 3 years ago

Windows 10 Pro 10.0.18363 Build 18363, AdoptOpenJDK build 1.8.0_282-b08, Spark 2.4.5, Scala 2.12.13

I'm getting this error trying to save a DataFrame as Parquet locally on my Windows 10 computer. Does anyone have any ideas on how to fix this, or why it throws an exception? I've tried changing the Java version, Spark version, Scala version, and winutils.exe version, all with the same result. It seems to be a Windows issue.

"java.io.IOException: Cannot run program "C:\hadoop\bin\winutils.exe": CreateProcess error=216, This version of %1 is not compatible with the version of Windows you're running. Check your computer's system information and then contact the software publisher"

geekalyssa commented 3 years ago

I am also facing this issue on Windows 10. How did you resolve it?

molotch commented 3 years ago

I used the 2.7.1 version from Steve Loughran's repo. I had used it before, so I just copied it from another project of mine. You might also have luck with the 2.7.3 version from this repo. I have no idea what the root cause is, though.

https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1/bin
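
One way to test a winutils.exe build in isolation, without going through Spark, is to invoke it directly; roughly speaking, this is the call Hadoop makes internally, and the same error=216 would surface here. A small sketch, assuming the binary sits at `C:\hadoop\bin\winutils.exe` and supports the `ls` subcommand:

```scala
// Sanity check: run winutils.exe directly so different versions can be
// swapped in and tested without a full Spark job.
import scala.sys.process._

object WinutilsSanityCheck {
  def main(args: Array[String]): Unit = {
    val winutils = "C:\\hadoop\\bin\\winutils.exe"
    // Listing an existing directory is harmless; if the binary is incompatible,
    // the same IOException (CreateProcess error=216) is thrown here.
    val exitCode = Seq(winutils, "ls", "C:\\tmp").!
    println(s"winutils exit code: $exitCode")
  }
}
```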

Wenjing323 commented 3 years ago

I hit the same issue when trying to write a Parquet file on Windows 10 using winutils.exe version 2.7.1. My Spark environment is spark-3.1.1-bin-hadoop2.7. After replacing winutils.exe 2.7.1 with version 2.8.1 below, the issue was fixed.

kristoffSC commented 2 years ago

In my case none of those worked. 2.7.1, 2.8.1, and 3.0.0 all fail with "This version of %1 is not compatible with the version of Windows you're running".

Windows 10 Pro 21H1 Build: 19043.1348

64-bit OS, x64 processor

EDIT: OK, it worked with 3.0.1, but I had to download the entire bin folder contents, not only hadoop.dll and winutils.exe.
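
The whole bin folder matters because hadoop.dll also has to be loadable, either from `java.library.path` or from a directory on PATH. A hedged sketch of that check (it mirrors what Hadoop's native code loading attempts; the `C:\hadoop\bin` path is an assumption):

```scala
// Check whether hadoop.dll can be loaded by the JVM.
// java.library.path is normally set at JVM launch, e.g.
//   -Djava.library.path=C:\hadoop\bin
object HadoopDllCheck {
  def main(args: Array[String]): Unit = {
    println("java.library.path = " + System.getProperty("java.library.path"))
    try {
      System.loadLibrary("hadoop") // resolves to hadoop.dll on Windows
      println("hadoop.dll loaded successfully")
    } catch {
      case e: UnsatisfiedLinkError =>
        println(s"hadoop.dll could not be loaded: ${e.getMessage}")
    }
  }
}
```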

abhinavsoti commented 2 years ago

I was also facing the same issue with every version from https://github.com/cdarlint/winutils that I pointed to (e.g. 2.7.2, 2.8.1, etc.). However, the same version (2.7.2) started working once I downloaded and extracted the complete winutils master folder from git and pointed to 2.7.2 again. It looks like downloading only the single file does something to it (which I am not interested in finding out), but that's how my error got resolved. The versions of Spark/Hadoop and winutils are as follows: Spark - spark-3.0.3, Hadoop - 2.7, winutils - 2.7.2 from https://github.com/cdarlint/winutils. A quick way to check whether a single-file download got mangled is sketched below.
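
As a small diagnostic (path below is an assumption, and a mangled download is only one plausible cause of this error): a valid Windows executable starts with the bytes "MZ", whereas a file accidentally saved from the GitHub HTML page starts with "<". Checking the header and comparing the file size against the repo can rule out a broken single-file download.

```scala
// Inspect the first two bytes of winutils.exe to see whether it looks like
// a real Windows (PE) executable rather than an accidentally saved HTML page.
import java.nio.file.{Files, Paths}

object WinutilsFileCheck {
  def main(args: Array[String]): Unit = {
    val path = Paths.get("C:\\hadoop\\bin\\winutils.exe")
    val bytes = Files.readAllBytes(path)
    val header = bytes.take(2).map(_.toChar).mkString
    println(s"size = ${bytes.length} bytes, header = '$header'")
    if (header == "MZ") println("Looks like a real Windows executable.")
    else println("Unexpected header -- the file is probably not a valid binary.")
  }
}
```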