The problem here is that the singletons library (which, due to its use of Template Haskell, is tightly coupled to the GHC version) changed how the Sing type is implemented: it used to be a data family and is now a type family.
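For reference, a rough sketch of that change (paraphrased from the singletons changelog, not the exact library source):

{-# LANGUAGE DataKinds, GADTs, PolyKinds, TypeFamilies #-}
import Data.Kind (Type)

-- Before (singletons < 2.6), Sing was a single data family:
--   data family Sing (a :: k)

-- From singletons 2.6, Sing is a type family that resolves to a per-kind
-- singleton type; roughly, for Bool:
type family Sing :: k -> Type

data SBool :: Bool -> Type where
  SFalse :: SBool 'False
  STrue  :: SBool 'True

type instance Sing = SBool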
I have a fix for this, but that fix causes other issues which I have not yet resolved.
For now the solution is to use ghc 8.6.5.
Thanks, Erik. Good luck!
Many thanks. I'll give it a whirl. (There is a warning on compilation about dodgy exports, which I assume is harmless.)
grenade itself builds, tests and benches OK with 8.8.1. Great.
Moving on to the examples, several library upper bounds are a bit too restrictive. Running cabal build --allow-newer produces:
[ 2 of 38] Compiling Grenade.Core.Shape ( src/Grenade/Core/Shape.hs, dist/build/Grenade/Core/Shape.o )
src/Grenade/Core/Shape.hs:174:36: error:
Ambiguous occurrence ‘natVal’
It could refer to
either ‘Data.Singletons.TypeLits.natVal’,
imported from ‘Data.Singletons.TypeLits’ at src/Grenade/Core/Shape.hs:33:1-41
(and originally defined in ‘GHC.TypeNats’)
or ‘GHC.TypeLits.natVal’,
imported from ‘GHC.TypeLits’ at src/Grenade/Core/Shape.hs:37:1-29
|
174 | let rows = fromIntegral $ natVal (Proxy :: Proxy rows)
| ^^^^^^
src/Grenade/Core/Shape.hs:175:36: error:
Ambiguous occurrence ‘natVal’
It could refer to
either ‘Data.Singletons.TypeLits.natVal’,
imported from ‘Data.Singletons.TypeLits’ at src/Grenade/Core/Shape.hs:33:1-41
(and originally defined in ‘GHC.TypeNats’)
or ‘GHC.TypeLits.natVal’,
imported from ‘GHC.TypeLits’ at src/Grenade/Core/Shape.hs:37:1-29
|
175 | columns = fromIntegral $ natVal (Proxy :: Proxy columns)
| ^^
which suggests that the CPP guard (meant to hide natVal) isn't taking effect. Happy to investigate further, but you may have a quick fix.
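In case it is useful, a sketch of the usual workaround (not necessarily the fix grenade will adopt, and rowCount is just a hypothetical example name): hide natVal from the singletons re-export so that only the GHC.TypeLits one is in scope.

import Data.Singletons.TypeLits hiding (natVal)
import GHC.TypeLits (KnownNat, natVal)
import Data.Proxy (Proxy (..))

-- With the hiding clause above, natVal is unambiguous again.
rowCount :: KnownNat n => Proxy n -> Int
rowCount = fromIntegral . natVal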
I am not seeing that with ghc 8.8.2, but I am seeing another warning which I will fix in the next day or two.
This library depends heavily on a bunch of relatively new GHC features and libraries that are still changing frequently. Supporting more than one minor version (e.g. 8.8.1 vs 8.8.2) is probably not worth the effort. I will attempt to support the latest minor version of recent GHCs; e.g. see: https://github.com/HuwCampbell/grenade/blob/master/.travis.yml#L5-L9
(Yes, I was actually using 8.8.2.) Are you saying that the examples compile (cabal build) out of the box, with no change in version bounds?
The tests and examples still have warnings. I intend to clean that up. The examples should build.
The library itself should build warning-free, but there is currently a single warning.
These two lines in the .travis.yml file:
- cabal-3.0 configure --enable-tests
- cabal-3.0 build all
ensure that the examples and tests are built in CI.
Thanks. That worked for me.
How much memory does mnist need? Running on Ubuntu eoan, I get:
cabal run mnist mnist_train.csv mnist_test.csv
Up to date
Training convolutional neural network...
Killed
I assume that the OS killed mnist because it tried to use too much memory, but that's just a guess.
Where do the mnist_train.csv and mnist_test.csv files come from?
IIRC they're renormalised versions of the Kaggle ones.
I'll see if I can dig my ones up and chuck them on a branch.
That would be very helpful. Or at least document what is expected.
Mine came from https://pjreddie.com/projects/mnist-in-csv/ (I don't want to have to subscribe to something to get test data.)
Better still would be to use the original data via ByteString.readFile.
Many thanks.
PS: dmesg confirms that mnist was killed because it went OOM.
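A diagnostic suggestion: assuming the mnist executable is built with ghc-options: -rtsopts, capping the heap makes the run fail with a Haskell heap-overflow error (and -s prints GC statistics) rather than being OOM-killed; the 4g limit below is only an example value.

cabal run mnist -- mnist_train.csv mnist_test.csv +RTS -M4g -s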
@jrp2014 did you run the CSV files from that page through the Python code?
No. I thought that they were the output, i.e. the translation from the original binary byte strings.
Are the values in the file normalized to [0..1.0]? I am pretty sure they need to be. @HuwCampbell?
No, the ones that I used (referenced above) are just rows of Ints in [0..255]. Since part of the point of the MNIST dataset is comparability, I'd be inclined to start from the originals at http://yann.lecun.com/exdb/mnist/ (the format is described at the bottom of the page) and work from there. It's easy enough to decode even the compressed .gz data into the requisite format using ByteString. The only reason the csv files exist is that courses that used them didn't want to have to teach file decompression, etc., before getting on to the good stuff.
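To make that concrete, here is a minimal sketch of reading the original IDX image file with bytestring and zlib (loadImages is a hypothetical name, not grenade's API; the 16-byte header and 28x28 image size come from the format description on that page, and pixels are rescaled to [0,1], which also answers the normalisation question above):

import qualified Codec.Compression.GZip as GZip
import qualified Data.ByteString.Lazy as BL
import Data.Word (Word8)

-- Decode train-images-idx3-ubyte.gz: skip the 16-byte IDX header, then read
-- 28*28 bytes per image and rescale each pixel from [0..255] to [0,1].
loadImages :: FilePath -> IO [[Double]]
loadImages path = do
  raw <- GZip.decompress <$> BL.readFile path
  let pixels = map toUnit (BL.unpack (BL.drop 16 raw))
  pure (chunksOf (28 * 28) pixels)
  where
    toUnit :: Word8 -> Double
    toUnit w = fromIntegral w / 255
    chunksOf n xs
      | null xs   = []
      | otherwise = take n xs : chunksOf n (drop n xs)

(The labels file is the same idea, with an 8-byte header and one byte per label.)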
When I cabal build against the current base, with ghc 8.8.2, I get the errors above. Please advise.