jetpacapp / DeepBeliefSDK

The SDK for Jetpac's iOS Deep Belief image recognition framework

[iOS] jpcnn couldn't open assets-library #30

Open bishalg opened 9 years ago

bishalg commented 9 years ago

I am trying to get an image from the Assets library using the following code:

NSString* networkPath = [[NSBundle mainBundle] pathForResource:@"jetpac" ofType:@"ntwk"];
if (networkPath == NULL) {
    fprintf(stderr, "Couldn't find the neural network parameters file - did you add it as a resource to your application?\n");
    assert(false);
}
network = jpcnn_create_network([networkPath UTF8String]);
assert(network != NULL);

ALAsset *result = self.assets[0];
NSString *imagePath; // = [[NSBundle mainBundle] pathForResource:@"slrCamera" ofType:@"png"];
imagePath = [result.defaultRepresentation.url absoluteString];

ALAssetsLibrary *lib = [ALAssetsLibrary new];
NSURL *url = result.defaultRepresentation.url;
[lib assetForURL:url resultBlock:^(ALAsset *asset) {
    NSString *finalImagePath = [asset.defaultRepresentation.url absoluteString];

    void* inputImage = jpcnn_create_image_buffer_from_file([finalImagePath UTF8String]);
    float* predictions;
    int predictionsLength;
    char** predictionsLabels;
    int predictionsLabelsLength;
    jpcnn_classify_image(network, inputImage, 0, 0, &predictions, &predictionsLength, &predictionsLabels, &predictionsLabelsLength);
    jpcnn_destroy_image_buffer(inputImage);
    for (int index = 0; index < predictionsLength; index += 1) {
        const float predictionValue = predictions[index];
        char* label = predictionsLabels[index % predictionsLabelsLength];
        NSString* predictionLine = [NSString stringWithFormat: @"%s - %0.2f\n", label, predictionValue];
        NSLog(@"predictionLine \n %@", predictionLine);
    }
    jpcnn_destroy_network(network);
} failureBlock:^(NSError *error) {
    NSLog(@"error = %@", error);
}];

But it crashes with the following error:

jpcnn couldn't open 'assets-library://asset/asset.JPG?id=CC74B475-2402-4494-8777-67E7659AB7BE&ext=JPG'

petewarden commented 9 years ago

Sorry you're hitting problems! It looks like you're trying to pass a URL to the image loading API, but that call is designed to work with a file system path; it needs to be something that the standard fopen() call can read.

I don't have an iOS example of loading an image, but you can look at what you need to do to load a network file: https://github.com/jetpacapp/DeepBeliefSDK/blob/gh-pages/examples/SimpleiOS/SquareCamViewController.m#L676

There I call pathForResource on the bundle; maybe you can do something similar for your image file?

elprl commented 9 years ago

I'm getting the same issue because the jpcnn_create_image_buffer_from_file() method is returning NULL. Here is my code, which is the same as the sample but with extra checks:

- (void)viewDidLoad
{
    [super viewDidLoad];

    NSString* networkPath = [[NSBundle mainBundle] pathForResource:@"jetpac" ofType:@"ntwk"];
    if (networkPath == NULL) {
        fprintf(stderr, "Couldn't find the neural network parameters file - did you add it as a resource to your application?\n");
        assert(false);
    }
    network = jpcnn_create_network([networkPath UTF8String]);
    assert(network != NULL);

    [self setupAVCapture];
    square = [[UIImage imageNamed:@"squarePNG"] retain];
    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyLow, CIDetectorAccuracy, nil];
    faceDetector = [[CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions] retain];
    [detectorOptions release];

    NSString* imagePath = [[NSBundle mainBundle] pathForResource:@"dirt" ofType:@"jpg"];
    NSFileManager *fm = [NSFileManager defaultManager];

    if ([fm fileExistsAtPath:imagePath isDirectory:nil]) { // isDirectory takes a BOOL*; pass nil when unused
        void* inputImage = jpcnn_create_image_buffer_from_file([imagePath UTF8String]); // returns null
        if (inputImage != NULL) {
            float* predictions;
            int predictionsLength;
            char** predictionsLabels;
            int predictionsLabelsLength;

            struct timeval start;
            gettimeofday(&start, NULL);
            jpcnn_classify_image(network, inputImage, 0, 0, &predictions, &predictionsLength, &predictionsLabels, &predictionsLabelsLength);
            struct timeval end;
            gettimeofday(&end, NULL);
            const long seconds  = end.tv_sec  - start.tv_sec;
            const long useconds = end.tv_usec - start.tv_usec;
            const float duration = ((seconds) * 1000 + useconds/1000.0) + 0.5;
            NSLog(@"Took %f ms", duration);

            jpcnn_destroy_image_buffer(inputImage);

            for (int index = 0; index < predictionsLength; index += 1) {
                const float predictionValue = predictions[index];
                if (predictionValue > 0.05) {
                    char* label = predictionsLabels[index % predictionsLabelsLength];
                    NSString* predictionLine = [NSString stringWithFormat: @"%s - %0.2f\n", label, predictionValue];
                    NSLog(@"%@", predictionLine);
                }
            }
        }
    }

    synth = [[AVSpeechSynthesizer alloc] init];

    labelLayers = [[NSMutableArray alloc] init];

    oldPredictionValues = [[NSMutableDictionary alloc] init];
}