Hi!
First of all, thanks for the great library!
I'm using your library via nd4j and I'm getting the following error:
Caused by: java.lang.OutOfMemoryError: Cannot allocate new LongPointer(2): totalBytes = 288, physicalBytes = 5300M
at org.bytedeco.javacpp.LongPointer.<init>(LongPointer.java:88)
at org.bytedeco.javacpp.LongPointer.<init>(LongPointer.java:53)
at org.nd4j.linalg.cpu.nativecpu.ops.NativeOpExecutioner.createShapeInfo(NativeOpExecutioner.java:2021)
at org.nd4j.linalg.api.shape.Shape.createShapeInformation(Shape.java:3249)
at org.nd4j.linalg.api.ndarray.BaseShapeInfoProvider.createShapeInformation(BaseShapeInfoProvider.java:67)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:195)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:189)
at org.nd4j.linalg.cpu.nativecpu.NDArray.<init>(NDArray.java:91)
at org.nd4j.linalg.cpu.nativecpu.CpuNDArrayFactory.create(CpuNDArrayFactory.java:420)
at org.nd4j.linalg.factory.Nd4j.create(Nd4j.java:4032)
at org.nd4j.linalg.api.ndarray.BaseNDArray.create(BaseNDArray.java:2002)
at org.nd4j.linalg.api.ndarray.BaseNDArray.get(BaseNDArray.java:4284)
at org.deeplearning4j.nn.layers.recurrent.LSTMHelpers.activateHelper(LSTMHelpers.java:252)
at org.deeplearning4j.nn.layers.recurrent.LSTM.activateHelper(LSTM.java:177)
at org.deeplearning4j.nn.layers.recurrent.LSTM.activate(LSTM.java:147)
at org.deeplearning4j.nn.graph.vertex.impl.LayerVertex.doForward(LayerVertex.java:111)
at org.deeplearning4j.nn.graph.ComputationGraph.outputOfLayersDetached(ComputationGraph.java:2380)
... 23 more
Caused by: java.lang.OutOfMemoryError: Physical memory usage is too high: physicalBytes (5300M) > maxPhysicalBytes (5300M)
at org.bytedeco.javacpp.Pointer.deallocator(Pointer.java:682)
at org.bytedeco.javacpp.Pointer.init(Pointer.java:127)
at org.bytedeco.javacpp.LongPointer.allocateArray(Native Method)
at org.bytedeco.javacpp.LongPointer.<init>(LongPointer.java:80)
... 39 more
This is caused by the following call:
```java
// ... get vectorLength dynamically
double[] flat = new double[vectorLength];
// ... fill in flat array here ...
Nd4j.create(flat, new int[]{1, vectorLength, 1}, 'c'); // OutOfMemoryError is thrown on this line
```
What's weird is that even the error message looks wrong: `physicalBytes (5300M) > maxPhysicalBytes (5300M)` compares two values that are printed as equal, and `totalBytes = 288` is far lower than `physicalBytes = 5300M`.
Can you suggest a possible solution for this problem? Does it require some configuration on my side, or is it a bug inside the JavaCPP library? Of course, I'm willing to contribute a fix if one is needed once I understand the issue, but I need some guidance.
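In case it's relevant: the only related configuration I've found so far are the JavaCPP memory-limit system properties (I'm assuming `org.bytedeco.javacpp.maxbytes` and `org.bytedeco.javacpp.maxphysicalbytes` are the right knobs here). A sketch of what I could try, with placeholder values:

```shell
# Hypothetical JVM launch raising the JavaCPP memory limits.
# MyApp and the 8G/12G values are placeholders, not my actual setup.
java -Xmx4g \
     -Dorg.bytedeco.javacpp.maxbytes=8G \
     -Dorg.bytedeco.javacpp.maxphysicalbytes=12G \
     MyApp
```

Would raising `maxPhysicalBytes` like this be the right approach here, or is the limit check itself at fault?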
Thanks for your time. I'm looking forward to your reply.
Kind Regards,
Piotr