
Android: Cannot use reflections - Keras and custom layers can't be used as a result #4504

Closed · abhaybd opened this issue Jan 12, 2018 · 17 comments
Labels: Bug (Bugs and problems)

Comments

abhaybd (Author) commented Jan 12, 2018

I trained a model in keras 1.2.2, and I'm trying to load it into DL4J 0.9.1 on Android.

I've followed the steps at #3261 to convert the model into a zip file using ModelSerializer. I was able to import the Keras model and export it in the DL4J format on desktop. However, when I try to import it on Android, it fails with an error about TensorFlowCnnToFeedForwardPreProcessor.
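For reference, the desktop-side conversion was along these lines (a rough sketch; file names are placeholders and the exact import call may differ depending on how the Keras model was saved):

import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;
import java.io.File;

// Import the Keras 1.2.2 model (architecture JSON + HDF5 weights), then re-save it in DL4J's zip format
MultiLayerNetwork net = KerasModelImport.importKerasSequentialModelAndWeights("model.json", "weights.h5");
ModelSerializer.writeModel(net, new File("dl4j-model.zip"), true);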

This is the relevant line in the stacktrace:
Caused by: java.lang.RuntimeException: org.nd4j.shade.jackson.databind.JsonMappingException: Could not resolve type id 'TensorFlowCnnToFeedForwardPreProcessor' into a subtype of [simple type, class org.deeplearning4j.nn.conf.InputPreProcessor]: known type ids = [InputPreProcessor, binomialSampling, cnnToFeedForward, cnnToRnn, composableInput, feedForwardToCnn, feedForwardToRnn, rnnToCnn, rnnToFeedForward, unitVariance, zeroMean, zeroMeanAndUnitVariance]

The full stacktrace is here: https://pastebin.com/ajTDzPS9
My build.gradle file is here: https://pastebin.com/HmhS9kJK

I'm trying to load the model using ModelSerializer.restoreMultiLayerNetwork().
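On Android, the load looks roughly like this (sketch; the zip is read from the app's assets, and the asset name and context variable are placeholders):

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;
import java.io.InputStream;

// Read the exported DL4J zip from assets and restore the network (IOException handling omitted)
InputStream is = context.getAssets().open("dl4j-model.zip");
MultiLayerNetwork net = ModelSerializer.restoreMultiLayerNetwork(is);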

The github repo with all the code is here: https://github.com/coolioasjulio/devanagari-excercise-app/tree/with-model-import

abhaybd added a commit to abhaybd/devanagari-excercise-app that referenced this issue Jan 12, 2018
Still fails when trying to import the model though. Opened up an issue on DL4J here: deeplearning4j/deeplearning4j#4504
saudet (Contributor) commented Jan 12, 2018

Looks like TensorFlowCnnToFeedForwardPreProcessor is missing from the list here:
https://github.com/deeplearning4j/deeplearning4j/blob/master/deeplearning4j-nn/src/main/java/org/deeplearning4j/nn/conf/InputPreProcessor.java#L39
Is it just a matter of adding it there @AlexDBlack ?

AlexDBlack (Contributor) commented:

Probably not, unfortunately - they are in different modules, so we can't reference the Keras classes in the InputPreProcessor annotations.
It should in theory get picked up via reflection (that's how it works normally), but whether reflection works properly on Android is another matter... (Removing reflection entirely here is on my to-do list, but that won't happen right away.)

saudet (Contributor) commented Jan 12, 2018

Ah, reflection. @coolioasjulio You might need to disable ProGuard on these classes to get reflection to work properly.
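Something along these lines in proguard-rules.pro might do it (a rough sketch; the package patterns are a guess at what needs to stay unobfuscated and unstripped):

# Keep DL4J/ND4J classes so reflection can still find them at runtime
-keep class org.deeplearning4j.** { *; }
-keep class org.nd4j.** { *; }
-dontwarn org.deeplearning4j.**
-dontwarn org.nd4j.**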

AlexDBlack (Contributor) commented:

If that doesn't work, I can probably come up with an ugly workaround to manually register the Keras classes for JSON (de)serialization. But I'd recommend trying to get reflection working properly first :)

abhaybd (Author) commented Jan 12, 2018

@saudet How do you disable proguard? I added this to proguard-rules.pro:

-dontoptimize
-dontshrink
-dontusemixedcaseclassnames
-dontskipnonpubliclibraryclasses
-dontpreverify
-verbose

However, the same error showed up. Then I tried setting this in the build.gradle in android > buildTypes > release:

useProguard false

Even with that set, the same error persists. Am I disabling ProGuard correctly?
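For reference, the release block I'm describing looks roughly like this (sketch; the exact flags depend on the Android Gradle plugin version):

android {
    buildTypes {
        release {
            minifyEnabled false   // disables code shrinking/obfuscation entirely
            useProguard false     // what I tried above; only meaningful on some plugin versions
        }
    }
}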

saudet (Contributor) commented Jan 12, 2018

Probably safe to say it's disabled, then. Is there anything in the log that points to errors with reflection?

abhaybd (Author) commented Jan 12, 2018

@saudet No, it doesn't say anything about reflection or proguard. It just says:
Caused by: java.lang.RuntimeException: org.nd4j.shade.jackson.databind.JsonMappingException: Could not resolve type id 'TensorFlowCnnToFeedForwardPreProcessor' into a subtype of [simple type, class org.deeplearning4j.nn.conf.InputPreProcessor]: known type ids = [InputPreProcessor, binomialSampling, cnnToFeedForward, cnnToRnn, composableInput, feedForwardToCnn, feedForwardToRnn, rnnToCnn, rnnToFeedForward, unitVariance, zeroMean, zeroMeanAndUnitVariance]

It then prints out the entire JSON structure of the model, and then references some classes. After disabling proguard the stack trace hasn't changed at all from before, even after cleaning and rebuilding. The full stack trace is at the top in the original issue, if you want to read the whole thing.

saudet (Contributor) commented Jan 12, 2018

Use Logback, Log4j, or something similar to display debug messages from DL4J and see what it says.
The dl4j-examples project uses Logback, for example:
https://github.com/deeplearning4j/dl4j-examples/blob/master/dl4j-examples/pom.xml#L142
https://github.com/deeplearning4j/dl4j-examples/tree/master/dl4j-examples/src/main/resources
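A minimal logback.xml along these lines, placed on the resources/classpath, should be enough to surface the debug output (sketch; on Android the logback-android artifact typically provides the backend):

<configuration>
  <!-- send everything at DEBUG and above to the console / logcat -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>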

abhaybd (Author) commented Jan 14, 2018

@saudet I set up Logback and this is what I got (these are only the lines printed by Logback):

I/System.out: 22:24:00.822 [AsyncTask #1] INFO org.nd4j.linalg.factory.Nd4jBackend - Loaded [CpuBackend] backend
I/System.out: 22:24:00.872 [AsyncTask #1] WARN org.reflections.Reflections - given scan urls are empty. set urls in the configuration
I/System.out: 22:24:01.009 [AsyncTask #1] INFO org.nd4j.nativeblas.NativeOpsHolder - Number of threads used for NativeOps: 4
I/System.out: 22:24:01.014 [AsyncTask #1] WARN org.reflections.Reflections - given scan urls are empty. set urls in the configuration
I/System.out: 22:24:01.430 [AsyncTask #1] INFO org.nd4j.nativeblas.Nd4jBlas - Number of threads used for BLAS: 4
I/System.out: 22:24:01.431 [AsyncTask #1] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Backend used: [CPU]; OS: [Linux]
I/System.out: 22:24:01.431 [AsyncTask #1] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Cores: [8]; Memory: [0.3GB];
I/System.out: 22:24:01.432 [AsyncTask #1] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Blas vendor: [OPENBLAS]
I/System.out: 22:24:01.636 [AsyncTask #1] WARN org.reflections.Reflections - given scan urls are empty. set urls in the configuration

So it looks like it has something to do with reflection. What does "given scan urls are empty. set urls in the configuration" mean? Is that something I can fix on my end?

AlexDBlack (Contributor) commented:

given scan urls are empty. set urls in the configuration

I'm not 100% sure, but I suspect this indicates that the reflections library can't find (or access?) JARs on the classpath. Normally it'll get a list of dependency JARs to scan, and then scan all of them for classes/resources. It's not something that the user should be messing with, generally.
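To illustrate (a sketch of how the Reflections library is typically configured, not DL4J's exact internal call): the warning is emitted when the URL set resolved from the classpath comes back empty, which appears to be what happens inside an Android APK.

import org.reflections.Reflections;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;

// On a desktop JVM this resolves the dependency JARs on the classpath;
// on Android (dex classes inside an APK) the URL collection can come back empty,
// which is what triggers the "given scan urls are empty" warning.
Reflections reflections = new Reflections(new ConfigurationBuilder()
        .setUrls(ClasspathHelper.forPackage("org.deeplearning4j"))
        .setScanners(new SubTypesScanner()));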

abhaybd (Author) commented Jan 15, 2018

I've already turned off ProGuard to see if that helps with reflection. What else can I do?

Is this a fix that would have to be implemented in the library? (not on my end)

AlexDBlack (Contributor) commented:

removing reflection entirely here is on my to-do list, but that won't happen right away

That is something we can do to fix it, but it's not a simple/easy fix...

Beyond that, I'm not sure why reflections is having trouble here.

As to the ugly workaround that I mentioned previously - you can try this before loading your network:

import java.util.ArrayList;
import java.util.List;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.modelimport.keras.preprocessors.TensorFlowCnnToFeedForwardPreProcessor; // from the deeplearning4j-modelimport module
import org.nd4j.shade.jackson.databind.jsontype.NamedType;

List<NamedType> types = new ArrayList<>();
types.add(new NamedType(TensorFlowCnnToFeedForwardPreProcessor.class, "TensorFlowCnnToFeedForwardPreProcessor"));
// Repeat new NamedType(...) for any other layers and preprocessors you are having trouble with
NeuralNetConfiguration.reinitMapperWithSubtypes(types);

Note that I have not tested this, but in theory this (or something close) should work.

saudet (Contributor) commented Jan 15, 2018

Looks like Reflections doesn't work so well on Android, so yeah we shouldn't be relying on it...
ronmamo/reflections#127

saudet closed this as completed Jan 15, 2018
saudet reopened this Jan 15, 2018
abhaybd (Author) commented Jan 16, 2018

@AlexDBlack Yeah, that works! I needed to add deeplearning4j-modelimport to my dependencies to access TensorFlowCnnToFeedForwardPreProcessor.class, but after that, the model is working like normal!
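The dependency line I added was roughly this (sketch; the configuration name and version should match your setup):

dependencies {
    // needed to reference TensorFlowCnnToFeedForwardPreProcessor.class
    compile 'org.deeplearning4j:deeplearning4j-modelimport:0.9.1'
}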

Thanks so much, I was having a hell of a time trying to get this to work!

abhaybd closed this as completed Jan 16, 2018
AlexDBlack (Contributor) commented:

Great. I'll re-open this actually, until we have a confirmed real/proper solution for it (i.e., no more reflections required)

AlexDBlack reopened this Jan 16, 2018
AlexDBlack changed the title from "Error importing keras 1.2.2 model in DL4J 0.9.1 on Android" to "Android: Cannot use reflections - Keras and custom layers can't be used as a result" Jan 16, 2018
AlexDBlack added the Bug (Bugs and problems) label Jan 29, 2018
AlexDBlack (Contributor) commented:

The Reflections library has been removed from DL4J entirely - addressed in #4956 and #4950.

This should no longer be an issue.

abhaybd added a commit to abhaybd/devanagari-excercise-app that referenced this issue Sep 18, 2018
Still fails when trying to import the model though. Opened up an issue on DL4J here: deeplearning4j/deeplearning4j#4504