I have followed everything without problems up to the point where I have to update the environment description in environment.yml for MXNet with CUDA support. My current entry is mxnet==1.4.0, which I changed to mxnet-cu101==1.4.0 since my CUDA version is 10.1.
However, when I run: conda env update -f environment.yml
I get the following error: Could not find a version that satisfies the requirement mxnet-cu101 (from -r /home/nenko/miniconda3/d2l-en/condaenv.alwxqubp.requirements.txt (line 1)) (from versions: ) No matching distribution found for mxnet-cu101 (from -r /home/nenko/miniconda3/d2l-en/condaenv.alwxqubp.requirements.txt (line 1))
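For reference, here is a sketch of what the edited pip section of environment.yml would look like (the environment name and Python pin below are illustrative, not necessarily the book's exact file). Note that the empty "(from versions: )" in the error means pip found no mxnet-cu101 wheel at all for that pin, so it is worth checking which mxnet-cu101 versions actually exist on PyPI before pinning.

```yaml
name: gluon            # illustrative; keep whatever name the book's file uses
dependencies:
  - python=3.6
  - pip:
    # CPU build replaced with the CUDA 10.1 build:
    # - mxnet==1.4.0
    - mxnet-cu101==1.4.0
```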
Hi! Any help here would be much appreciated. I have CUDA 10.0 installed and running, so when I updated my environment.yml I changed mxnet to mxnet-cu100. However, when I run the following in a Jupyter notebook:
import mxnet as mx
I receive the error:
libcudart.so.10.0: cannot open shared object file: No such file or directory
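libcudart.so.10.0 not being found usually means the CUDA runtime directory is not on the dynamic loader's search path. A common workaround, assuming CUDA was installed under the default prefix /usr/local/cuda-10.0:

```shell
# Make the CUDA 10.0 runtime visible to the dynamic linker for this shell;
# append this line to ~/.bashrc to make it permanent.
export LD_LIBRARY_PATH=/usr/local/cuda-10.0/lib64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
```

If CUDA lives elsewhere, `ldconfig -p | grep libcudart` can help locate the actual directory.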
I think the best way to set up the environment would be a Docker container with preinstalled packages. When you discuss Linux you refer to apt, but Debian/Ubuntu is not the only flavor of Linux.
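A minimal sketch of what such an image could look like (the base image tag and package pins below are illustrative, not an official d2l image):

```dockerfile
# Hypothetical image: CUDA 10.0 runtime plus the book's Python dependencies.
FROM nvidia/cuda:10.0-cudnn7-runtime-ubuntu18.04
RUN apt-get update && apt-get install -y python3-pip && \
    pip3 install mxnet-cu100==1.4.0 jupyter
WORKDIR /workspace
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--allow-root"]
```

Run with GPU access via `docker run --gpus all` (or nvidia-docker on older Docker versions), and this would sidestep both the apt dependency and the CUDA-matching problems above.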
Hi, I cloned the repo and noticed all the chapter notebooks are saved as .md files. Might be a silly question with a simple fix, but how do I run these? I installed this a few weeks ago and they were .ipynb as normal. Do I have to change them somehow? How can I do this with all of them? Why are they saved as .md in the first place?
I have solved the issue. The GitHub repo holds the chapter markdowns; the actual code for the book should be obtained by following the install instructions. I had simply thought one could fork the repo and run the code straight from GitHub.
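For anyone else hitting this: the repo keeps notebooks as markdown, and tools such as notedown or jupytext convert them to .ipynb. A stripped-down sketch of that conversion using only the standard library (the real tools handle many more cases, e.g. cell metadata and other languages):

```python
import json
import re

def md_to_ipynb(md_text):
    """Split markdown into notebook cells: fenced ```python blocks become
    code cells, everything else becomes markdown cells. A simplified sketch
    of what notedown/jupytext do."""
    cells = []
    # Odd-indexed parts of the split are the captured code-block bodies.
    parts = re.split(r"```python\n(.*?)```", md_text, flags=re.S)
    for i, part in enumerate(parts):
        part = part.strip("\n")
        if not part:
            continue
        if i % 2 == 1:
            cells.append({"cell_type": "code", "metadata": {},
                          "execution_count": None, "outputs": [],
                          "source": part.splitlines(keepends=True)})
        else:
            cells.append({"cell_type": "markdown", "metadata": {},
                          "source": part.splitlines(keepends=True)})
    return {"cells": cells, "metadata": {}, "nbformat": 4, "nbformat_minor": 5}

# Usage: json.dump(md_to_ipynb(open("ndarray.md").read()), open("ndarray.ipynb", "w"))
```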
I don't have an NVIDIA GPU, only Intel integrated graphics, so installing CUDA gives an error. How do I proceed with the installation? I am also unable to install mxnet; it fails with "Could not install packages due to an EnvironmentError: [WinError 5] Access is denied: 'c:\programdata\anaconda3\lib\site-packages\idna-2.8-py3.7.egg-info\dependency_links.txt'
Consider using the --user option or check the permissions."
This appears broken for me. I'm on Ubuntu 18.04 with Python 3.6.5 using 4.5.4. I'm trying to follow the instructions, but when I test the installation by running "import d2l" in the notebook I get:
It would be nice if the downloaded d2l code were labeled with the corresponding chapters.
For example, 2.1.1: getting started is located in
d2l-en/chapter_crashcourse/ndarray.ipynb
If it was in
d2l-en/2.crashcourse/1.ndarray.ipynb
it would be a lot easier to find.
Or at least give the notebooks the same name as the chapter: 2.1 Data manipulation → data_manipulation.ipynb instead of ndarray.ipynb.
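Until the naming changes, a small lookup table mapping section titles to notebook paths can help navigation. A sketch (the one entry below is the example from this thread, not a complete table of contents):

```python
# Hypothetical section-to-notebook map; extend with the rest of the TOC.
SECTIONS = {
    "2.1 Data manipulation": "chapter_crashcourse/ndarray.ipynb",
}

def notebook_for(section, sections=SECTIONS):
    """Return the notebook path for a section title, or None if unknown."""
    return sections.get(section)
```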
Very neat project!
Soon all hard-science textbooks will be in this format.
Just one question for now:
is there an index available for the downloaded version?
In "mxnet-the-straight-dope-master" there was a README.md file
that could be converted to ipynb and served as an index.
When trying to install the d2l dependency on Google Colab, I am getting a prompt_toolkit error. The error arises because the Colab Jupyter kernel uses an older prompt_toolkit version while d2l requires a newer one. How do I solve this?
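A common workaround for this class of conflict is to pin a prompt_toolkit version compatible with Colab's kernel and then restart the runtime (Runtime → Restart runtime), since the kernel keeps the old version loaded until restarted. Whatever you pin, this stdlib snippet (Python 3.8+) reports which version the running kernel actually sees, which is what matters:

```python
from importlib import metadata

def pt_version():
    """Return the installed prompt_toolkit version string, or None if absent."""
    try:
        return metadata.version("prompt_toolkit")
    except metadata.PackageNotFoundError:
        return None

print("prompt_toolkit:", pt_version())
```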
Hi, I'm new to this book and am excited to get into learning. I have followed the installation steps and am able to launch Jupyter Notebook properly; however, in the first part of lesson two, when I try to run the code "from mxnet import mx" I get an OS error stating that I have a missing module, even though I have already pip-installed mxnet. Can someone please let me know what is going on? It is extremely frustrating.
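One thing worth checking before anything else: the import line itself. The mxnet package has no submodule named mx, so "from mxnet import mx" fails even on a working install; the book's notebooks use "import mxnet as mx". A small self-contained check that only reports whether mxnet is importable at all in the current environment:

```python
def mxnet_available():
    """Return True if the mxnet package can be imported here."""
    try:
        import mxnet  # noqa: F401
        return True
    except ImportError:
        return False

print("mxnet importable:", mxnet_available())
```

If this prints False inside Jupyter but pip says mxnet is installed, the notebook kernel is likely running a different Python than the one pip installed into.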
Could anyone help with this mismatch error? I got it to work previously with @ThomasDelteil's advice on April 10th, but following the advice again now has not resolved the issue.
The exact error I get, upon trying to import mxnet as mx in the notebook, is the following:
Hi, I have Spyder (Python 3.7.4) open and am running the first code lines. "jupyter notebook" always seems to give me the error SyntaxError: invalid syntax. Does anyone know why?
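The likely cause: jupyter notebook is a shell command, not Python. Typing it into Spyder's Python console asks the Python parser to interpret two bare names side by side, which is invalid syntax; it needs to be run in a terminal (e.g. Anaconda Prompt) instead. A quick demonstration that the parser rejects it:

```python
def is_valid_python(src):
    """Return True if src parses as Python source code."""
    try:
        compile(src, "<console>", "exec")
        return True
    except SyntaxError:
        return False

# Two adjacent names with no operator between them are not valid Python:
print(is_valid_python("jupyter notebook"))  # False
```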