I've been trying to create predictors using the GPU, but can't seem to make it work. For example, the provided test script (tests/pr3/test.go) compiles and runs just fine as-is. However, if I change line 37:
mxnet.Device{mxnet.CPU_DEVICE, 0},
to:
mxnet.Device{mxnet.GPU_DEVICE, 0},
the code still compiles, but fails to create the predictor with the following error:
~/golang/src/github.com/songtianyi/go-mxnet-predictor/tests/pr3$ ./test
[14:13:20] src/nnvm/legacy_json_util.cc:209: Loading symbol saved by previous version v0.9.4. Attempting to upgrade...
[14:13:20] src/nnvm/legacy_json_util.cc:217: Symbol successfully upgraded!
panic: file exists
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x491de2]
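For reference, the predictor creation around line 37 looks roughly like this. This is a sketch reconstructed from the repo's README rather than a verbatim copy of tests/pr3/test.go, so the model file names and input shape here are placeholders:

```go
package main

import (
	"io/ioutil"

	"github.com/songtianyi/go-mxnet-predictor/mxnet"
)

func main() {
	// Load the serialized symbol and trained parameters from disk.
	// These file names stand in for whatever the test actually loads.
	symbol, err := ioutil.ReadFile("model-symbol.json")
	if err != nil {
		panic(err)
	}
	params, err := ioutil.ReadFile("model-0000.params")
	if err != nil {
		panic(err)
	}

	// Swapping CPU_DEVICE for GPU_DEVICE in this call is the only change
	// that triggers the failure above.
	p, err := mxnet.CreatePredictor(symbol,
		params,
		mxnet.Device{mxnet.GPU_DEVICE, 0}, // was: mxnet.Device{mxnet.CPU_DEVICE, 0}
		[]mxnet.InputNode{{Key: "data", Shape: []uint32{1, 3, 224, 224}}},
	)
	if err != nil {
		panic(err) // this is where "panic: file exists" surfaces on GPU
	}
	defer p.Free()
}
```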
This is with a CUDA-enabled build of mxnet 1.2.0 and a K80 GPU. The mxnet examples (both Python and C++) work fine and can use the GPU.
I presume there's something obvious I'm missing, but I'm not too familiar with mxnet itself, so you might have better insight. I can see that mxnet.GPU_DEVICE effectively becomes the int value 2 through the enum, which does appear to be the value corresponding to the GPU device type in mxnet, and GPU 0 is the only GPU available on the machine I'm using.
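For what it's worth, my understanding of how the enum values line up with MXNet's device-type codes is sketched below (this assumes the Go binding mirrors the C API's constants; the actual declarations in the mxnet package may look different):

```go
// MXNet's C API device-type codes; the binding's device constants
// presumably resolve to these integer values.
const (
	kCPU       = 1 // what mxnet.CPU_DEVICE should map to
	kGPU       = 2 // what mxnet.GPU_DEVICE should map to -- the value I see
	kCPUPinned = 3 // pinned host memory, unused here
)
```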