microsoft / CNTK

Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit
https://docs.microsoft.com/cognitive-toolkit/

BrainScript Exception occurred: ConfigValue (bool): boolean argument expected #558

Closed whitmark closed 8 years ago

whitmark commented 8 years ago

I am attempting to convert a working NDL script to BrainScript and am getting the exception below:

`EXCEPTION occurred: ConfigValue (bool): boolean argument expected`

that is coming from `Source/Common/Include/Config.h`, and it is unclear what the issue is. See the runtime output/call stack and config file below. I am using the Release_CpuOnly edition.

Would appreciate any thoughts on what this might be. Thanks.

Runtime Output

```
<<<<<<<<<<<<<<<<<<<< RAW CONFIG WITH ALL VARIABLES RESOLVED <<<<<<<<<<<<<<<<<<<<
>>>>>>>>>>>>>>>>>>>> PROCESSED CONFIG WITH ALL VARIABLES RESOLVED >>>>>>>>>>>>>>>>>>>>
configparameters: dlt_dnn01_up_bs.cntk:command=Train
configparameters: dlt_dnn01_up_bs.cntk:ConfigDir=../Config
configparameters: dlt_dnn01_up_bs.cntk:DataDir=../Data
configparameters: dlt_dnn01_up_bs.cntk:deviceId=-1
configparameters: dlt_dnn01_up_bs.cntk:hiddenDimension=12
configparameters: dlt_dnn01_up_bs.cntk:labelDimension=2
configparameters: dlt_dnn01_up_bs.cntk:ModelDir=../Output/Models
configparameters: dlt_dnn01_up_bs.cntk:modelPath=../Output/Models/dlt_dnn01_up.dnn
configparameters: dlt_dnn01_up_bs.cntk:OutputDir=../Output
configparameters: dlt_dnn01_up_bs.cntk:outputNodeNames=ScaledLogLikelihood
configparameters: dlt_dnn01_up_bs.cntk:precision=float
configparameters: dlt_dnn01_up_bs.cntk:RootDir=..
configparameters: dlt_dnn01_up_bs.cntk:sampleDimension=23
configparameters: dlt_dnn01_up_bs.cntk:traceLevel=1
configparameters: dlt_dnn01_up_bs.cntk:Train=[
    action="train"
    BrainScriptNetworkBuilder=[
        SDim = 23
        HDim = 12
        LDim = 2
        features = Input (SDim)
        labels = Input (LDim)
        W0 = Parameter (HDim, SDim) ; b0 = Parameter (HDim, 1)
        W1 = Parameter (LDim, HDim) ; b1 = Parameter (LDim, 1)
        r = RectifiedLinear (W0 * features + b0)
        z = W1 * r + b1
        P = Softmax (z)
        ce = CrossEntropyWithSoftmax (labels, z)
        errs = ErrorPrediction (labels, z)
        featureNodes    = (features)
        labelNodes      = (labels)
        criterionNodes  = (ce)
        evaluationNodes = (errs)
        outputNodes     = (z:P)
    ]
    SGD = [
        epochSize = 0
        minibatchSize = 32
        learningRatesPerMB = 0.1
        momentumPerMB = 0.9
        L2RegWeight = 0
        dropoutRate = 0.0
        maxEpochs = 30
    ]
    reader = [
        readerType = "CNTKTextFormatReader"
        file = "Train-DLT-dnn01-up.txt"
        randomize = "auto"
        maxErrors = 10
        traceLevel = 1
        input = [
            features = [
                alias = "F"
                dim = 23
                format = "dense"
            ]
            labels = [
                alias = "L"
                dim = 2
                format = "dense"
            ]
        ]
    ]
]
<<<<<<<<<<<<<<<<<<<< PROCESSED CONFIG WITH ALL VARIABLES RESOLVED <<<<<<<<<<<<<<<<<<<<
Commands: Train
Precision = "float"
CNTKModelPath: ../Output/Models/dlt_dnn01_up.dnn
CNTKCommandTrainInfo: Train : 30
CNTKCommandTrainInfo: CNTKNoMoreCommands_Total : 30
##############################################################################
#                                                                            #
# Action "train"                                                             #
#                                                                            #
##############################################################################
CNTKCommandTrainBegin: Train
[CALL STACK]
    > Microsoft::MSR::CNTK::ConfigParameters::operator()<bool>
    - Microsoft::MSR::CNTK::TextConfigHelper::TextConfigHelper
    - Microsoft::MSR::CNTK::CNTKTextFormatReader::CNTKTextFormatReader
    - std::make_shared<Microsoft::MSR::CNTK::CNTKTextFormatReader,std::shared_ptr<Microsoft::MSR::CNTK::HeapMemoryProvider>,Microsoft::MSR::CNTK::ConfigParameters const & __ptr64>
    - <lambda_5c1aff6001d246c70ac921b62b1bf871>::<helper_func_cdecl>
    - Microsoft::MSR::CNTK::ReaderShim<float>::Init
    - Microsoft::MSR::CNTK::DataReader::DataReader<Microsoft::MSR::CNTK::ConfigParameters>
    - std::make_shared<Microsoft::MSR::CNTK::DataReader,Microsoft::MSR::CNTK::ConfigParameters & __ptr64>
    - CreateObject<Microsoft::MSR::CNTK::DataReader>
    - DoTrain<Microsoft::MSR::CNTK::ConfigParameters,float>
    - DoCommands<float>
    - wmainOldCNTKConfig
    - wmain1
    - wmain
    - __tmainCRTStartup
    - BaseThreadInitThunk
EXCEPTION occurred: ConfigValue (bool): boolean argument expected
```

CNTK Config

```
# Copyright (c) Microsoft. All rights reserved.
# Licensed under the MIT license. See LICENSE file in the project root for full license information.

# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir   = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir  = "$OutputDir$/Models"

# which commands to run
command=Train

# required...
precision = "float"
modelPath="$OutputDir$/Models/dlt_dnn01_up.dnn" # where to write the model to

# uncomment the following line to write logs to a file
# stderr = "$OutputDir$/dtl_dnn01_up_out"
#numMBsToShowResult=500
traceLevel=1
deviceId=-1            # CPU < 0
outputNodeNames = ScaledLogLikelihood

sampleDimension=23     # sample dimensions
hiddenDimension=12     # hidden dimensions
labelDimension=2       # label dimensions

#######################################
#  TRAINING CONFIG                    #
#######################################
Train=[
    action="train"

    # network description
    BrainScriptNetworkBuilder=[

        # feature and label dimensions
        SDim = $sampleDimension$
        HDim = $hiddenDimension$
        LDim = $labelDimension$

        # set input tensors
        features = Input (SDim)
        labels = Input (LDim)

        # model parameters to learn
        W0 = Parameter (HDim, SDim) ; b0 = Parameter (HDim, 1)
        W1 = Parameter (LDim, HDim) ; b1 = Parameter (LDim, 1)

        # computation network formulas
        r = RectifiedLinear (W0 * features + b0)
        z = W1 * r + b1

        # network outputs
        P = Softmax (z)
        ce = CrossEntropyWithSoftmax (labels, z)
        errs = ErrorPrediction (labels, z)

        # special nodes
        featureNodes    = (features)
        labelNodes      = (labels)
        criterionNodes  = (ce)
        evaluationNodes = (errs)
        outputNodes     = (z:P)

    ]

    # Stochastic Gradient Descent
    SGD = [
        epochSize = 0                  # =0 means size of the training set
        minibatchSize = 32             # recommended to use sizes in multiples of 32
        learningRatesPerMB = 0.1       # learning rates per MB
        momentumPerMB = 0.9
        L2RegWeight = 0                # =0 default
        dropoutRate = 0.0              # =0 default
        maxEpochs = 30
    ]

    # parameter values for the reader
    reader = [
        readerType = "CNTKTextFormatReader"
        file = "Train-DLT-dnn01-up.txt"
        randomize = "auto"
        maxErrors = 10
        traceLevel = 1

        input = [
            features = [
                alias = "F"
                dim = $sampleDimension$
                format = "dense"
            ]
            labels = [
                alias = "L"
                dim = $labelDimension$
                format = "dense"
            ]
        ]
    ]
]
```
eldakms commented 8 years ago

Use `randomize = true` in the reader.
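In other words, the reader's `randomize` parameter is parsed as a boolean, so the string `"auto"` triggers the `ConfigValue (bool)` exception. The reader block from the config above, with only that line changed:

```
reader = [
    readerType = "CNTKTextFormatReader"
    file = "Train-DLT-dnn01-up.txt"
    randomize = true       # boolean, not "auto"
    maxErrors = 10
    traceLevel = 1
    ...
]
```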

whitmark commented 8 years ago

That's it . . . thank you!