Following up on my previous blog post, I have been watching Jeremy Howard's FastAI Deep Learning lectures online.
Here is a link to the 2018 deep learning edition of the coursework - it is absolutely free and gives great insight into practical deep learning with its hands-on approach of diving into conceptual details by "doing".
The FastAI library for the course is built on top of PyTorch and provides a good top-level API that lets you start creating your deep learning models in minutes!
Setting up the library has been difficult because of its interdependencies on different package versions and breakages caused by updates to the packages it is built upon. I am going to list out the steps I followed to get this up and running.
1. Follow this guide on Medium.com to set up a Google Compute Engine instance on Google Cloud
2. The issue I ran into was the step where you curl setup.sh to run a bunch of commands that set up different packages. Some of the links have become outdated, which breaks the flow of the setup script.
3. Instead, I suggest you go the conda way of installing all dependencies, which is much easier and kind of takes care of itself.
This assumes you have followed step 1 and have a Google Compute Engine instance set up.
I highly recommend you use the Google Cloud SDK to log in to the cloud.
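If you have just installed the SDK, you will first need to authenticate and point it at your project (the project ID and zone below are placeholders - substitute your own):

gcloud auth login
gcloud config set project <your_project_id>
gcloud config set compute/zone <your_zone>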
Command line to SSH onto the Google Cloud machine -
gcloud compute ssh <instance_name> --ssh-flag="-L" --ssh-flag="8888:127.0.0.1:8888"
NOTE: The extra ssh flags forward local port 8888 to port 8888 on the remote compute machine, which makes life easier when you get to step 4.
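Under the hood, gcloud simply invokes ssh with those flags appended, so if you prefer plain ssh the equivalent should be something like the following (the key path and placeholders here are assumptions based on gcloud's defaults):

ssh -i ~/.ssh/google_compute_engine -L 8888:127.0.0.1:8888 <your_username>@<instance_external_ip>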
Next, run these commands to clone the FastAI repository and create a conda environment with the package dependencies.
git clone https://github.com/fastai/fastai.git
cd fastai
conda env create -f environment.yml
You then activate that environment with:
conda activate fastai
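To confirm the environment is working and that PyTorch can actually see the GPU (assuming you provisioned a GPU instance in step 1), a quick sanity check is:

python -c "import torch; print(torch.cuda.is_available())"

This should print True; if it prints False, the CUDA drivers on the instance are most likely not set up correctly.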
Now, you can start your Jupyter notebook
jupyter notebook --ip=0.0.0.0 --port=8888 --no-browser &
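Jupyter will print a URL containing a token on the command line. Thanks to the ssh tunnel from the earlier gcloud command, you can open http://localhost:8888 in the browser on your local machine and paste that token. If you lose track of the token, listing the running servers on the remote machine shows it again:

jupyter notebook list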
Resources -
If you are having issues with installing standalone Anaconda or TensorFlow, I recommend you follow this basic tutorial: https://haroldsoh.com/2016/04/28/set-up-anaconda-ipython-tensorflow-julia-on-a-google-compute-engine-vm/
If you are running into installation issues, another helpful resource is the FastAI support forum.
http://forums.fast.ai/t/fastai-v0-7-install-issues-thread/24652
Note -
I ran into a bunch of Jupyter notebook issues:
1. The notebook would not launch, complaining about "allow_remote_access" (more info is here).
The solution to this was to downgrade the notebook package to 5.6.0 (an alternative that avoids the downgrade is sketched below, after this list):
pip uninstall notebook
pip install notebook==5.6.0
2. After downgrading, it would start the notebook and display a token on the command line. However, I could not access it through localhost or a static IP configuration, even after adding a firewall rule to allow TCP ports 8888-8999.
After playing around with things, I discovered the ssh-flag option you can pass to gcloud ssh to map 8888 as a local port on your machine. Step 3 outlines the command line syntax to achieve this, and when you launch the notebook don't forget to force it to the same port (8888) as listed in step 4.
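As an alternative to downgrading (I did not go this route myself, so treat it as a sketch): notebook 5.7+ exposes an allow_remote_access setting, so you could instead generate a config file and enable it there:

jupyter notebook --generate-config

and then set c.NotebookApp.allow_remote_access = True in ~/.jupyter/jupyter_notebook_config.py.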