!nvidia-smi command not found

I installed the GPU drivers, CUDA, etc., but I still can't run !nvidia-smi in my Jupyter Notebook without getting a "command not found" error. Does this have to do with where my framework is located? I'm really lost.

You have to SSH into the instance from your terminal; you should be able to run the command there.

How do you SSH into the instance?

Are you using an EC2 instance or running the notebook on your local device?
If you're using an EC2 instance, try this tutorial: https://towardsdatascience.com/setting-up-and-using-jupyter-notebooks-on-aws-61a9648db6c5
I'm not sure what to do for a local device, though.
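One quick check you can do from inside the notebook itself: `!` commands run with the Jupyter kernel's environment, so if the directory containing nvidia-smi isn't on the kernel's PATH you'll get "command not found" even though the driver is installed. A minimal sketch (the helper name is mine, not a standard API):

```python
import os
import shutil

def find_binary(name):
    """Return the full path to `name` as seen by this Python process
    (the Jupyter kernel), or None if it is not on the kernel's PATH."""
    path = shutil.which(name)
    if path is None:
        print(f"{name} not found; the kernel's PATH is:")
        for d in os.environ.get("PATH", "").split(os.pathsep):
            print("  ", d)
    return path

find_binary("nvidia-smi")
```

If this prints a path, `!nvidia-smi` should work in a cell; if it prints None, the kernel was launched with a PATH that doesn't include the driver's directory.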

Here’s an example of where you can find nvidia-smi on a p2.xlarge instance running CUDA 10.0.

(gluon) ubuntu@ip-172-31-63-98:~$ which nvidia-smi
(gluon) ubuntu@ip-172-31-63-98:~$ nvidia-smi
Tue Jan 29 00:51:54 2019
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 410.79       Driver Version: 410.79       CUDA Version: 10.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Tesla K80           On   | 00000000:00:1E.0 Off |                    0 |
| N/A   53C    P0    60W / 149W |    259MiB / 11441MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0     15102      C   python                                       248MiB |
+-----------------------------------------------------------------------------+
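If the binary exists on the instance but the notebook kernel can't see it, you can also call it by absolute path from a cell. A sketch, assuming a typical install location (verify the real path with `which nvidia-smi` over SSH first; `gpu_status` is a hypothetical helper, not a library function):

```python
import subprocess

# Assumed location; confirm with `which nvidia-smi` in an SSH session.
NVIDIA_SMI = "/usr/bin/nvidia-smi"

def gpu_status(binary=NVIDIA_SMI):
    """Run nvidia-smi by absolute path and return its output,
    or an error message if the binary is missing."""
    try:
        result = subprocess.run([binary], capture_output=True, text=True)
        return result.stdout or result.stderr
    except FileNotFoundError:
        return f"{binary} does not exist on this machine"

print(gpu_status())
```

This sidesteps the PATH problem entirely, though fixing PATH before launching Jupyter is the cleaner long-term fix.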