
5-Day Deep Learning Bootcamp, November 2017: ALS Fundraising

Event Information


Date and Time



Microsoft ILDC Auditorium

13 Shenkar st.




Refund Policy


Refunds up to 7 days before event

Event description



Latest event information is always here:

About our hosts:

Microsoft and Nice Systems are hosting this event.

Microsoft will host the first 3 days out of the 5-day boot camp whereas Nice will host the last 2 days.

Microsoft will also provide 250 GPU accounts for the duration of the 5-day boot camp. A separate invitation will be sent for the creation of the accounts.


The Bootcamp is organized to amalgamate "theory" and "practice", recognizing that a deep learning scientist needs a survey of concepts combined with strong application of practical techniques through labs. First, the foundational material and tools of the data science practitioner are presented via scikit-learn. Topics continue rapidly into exploratory data analysis and classical machine learning, where data is organized, characterized, and manipulated. From day two, students move from engineered models into four days of deep learning.

Target Audience and Prerequisites:

Experience in Python programming is a must, as is basic knowledge of calculus, linear algebra, and probability theory. We also assume familiarity with neural networks at the level of an introductory AI class (such as one based on the Russell and Norvig book).

Attendees are expected to bring their own laptops for the hands-on practical work.

A remote GPU account, provided by Microsoft, will be available for each participant.



A deep learning Docker image based on Ubuntu Linux 16.04:

• A computer with a GPU that can run CUDA 8; alternatively, set up the above-named Docker image on Google Cloud, see instructions here:

Workshop Agenda: (every day from ~16:30 to ~21:30)

Day 01 Gathering:

A few words from the admins.

• About the Israeli ALS society, by Efrat Carmi, CEO.

A few words from our host, Microsoft.

Day 01: Practical machine learning with Python and sk-learn pipelines:

• Binary classification using Logistic Regression (Theory)

• Loading data from CSVs and HDF5 (Lab)

• Binary classification using Logistic Regression (Lab)

• Feature generation and data augmentation (Lab)

• Feature reduction using PCA and XGBoost (Lab)

• Hyper-parameter optimization using sk-learn grid and randomized search (Lab)

• K-fold cross-validation (Lab)

• Using sk-learn pipelines (Lab)

• Plotting ROC AUC and log-loss (Lab)

• Ensembling models: a majority-voting classifier (Lab)

• Solving the challenge: a binary classification problem (Lab)
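The Day 01 labs above can be sketched end to end in a few lines of scikit-learn: a Pipeline that scales features, reduces them with PCA, and fits a logistic regression, tuned with grid search over stratified k-fold cross-validation. The synthetic dataset and the specific parameter grid are illustrative assumptions, not the workshop's actual lab data.

```python
# A minimal sketch of the Day 01 workflow: a scikit-learn Pipeline with
# logistic regression, hyper-parameter grid search, and k-fold cross-validation.
# The dataset (synthetic, via make_classification) is an assumption for illustration.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),    # feature scaling
    ("pca", PCA(n_components=10)),  # feature reduction with PCA
    ("clf", LogisticRegression(max_iter=1000)),
])

# Grid search over the regularization strength, scored by ROC AUC
grid = GridSearchCV(
    pipe,
    param_grid={"clf__C": [0.01, 0.1, 1.0, 10.0]},
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    scoring="roc_auc",
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

Wrapping every step in one Pipeline ensures the scaler and PCA are re-fit inside each cross-validation fold, which avoids leaking validation data into preprocessing.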

Day 02: Neural networks using the GPU, PyCUDA, PyTorch and MATLAB:

This intensive session will discuss CUDA, GPU-based tensors, and Python acceleration. PyTorch supports GPU tensors natively, and with PyCUDA one can run arbitrary CUDA code from Python. MATLAB has native CUDA code-generation support. Most developers are aware that some algorithms can run on a GPU instead of a CPU and see order-of-magnitude speedups. However, many people assume that:

1. Only specialist areas like deep learning are suitable for GPUs

2. Learning to program a GPU takes years of developing specialist knowledge

It turns out that neither assumption is true! Nearly any non-recursive algorithm that operates on datasets of 1,000+ items can be accelerated by a GPU, and recent libraries like PyTorch, PyCUDA, and MATLAB make writing a GPU-accelerated algorithm nearly as simple as writing a regular CPU algorithm.
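As a taste of how simple this has become, here is a device-agnostic PyTorch sketch: the same code runs on a CUDA GPU when one is available and falls back to the CPU otherwise. The matrix size is an illustrative choice.

```python
# A short sketch of device-agnostic GPU tensors in PyTorch: the same code runs
# on a CUDA GPU when one is available and falls back to the CPU otherwise.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A 1000x1000 dense matrix multiply -- the kind of non-recursive workload
# on 1,000+ items that sees large speedups on a GPU.
a = torch.randn(1000, 1000, device=device)
b = torch.randn(1000, 1000, device=device)
c = a @ b  # runs on the GPU if device is "cuda", otherwise on the CPU

print(c.shape, c.device)
```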

Part one (focus: GPU-based tensors):

• Installing a GPU-based deep learning environment with Docker (Lab)

• The GPU and the CPU (Theory) and Architectural building blocks of GPU computing (Theory)

• Basics of CUDA: threads, blocks, and grids

• GPU-based tensors in PyCUDA and PyTorch

• Image processing on the GPU using PyCUDA (Lab)

• Image processing on the GPU using PyTorch (Lab)

• Deep Learning in MATLAB on the GPU using CUDA (Lab)
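The flavor of the PyTorch image-processing lab can be sketched in a few lines: a 3x3 box blur applied with `conv2d`, on the GPU when CUDA is available. The random "image" stands in for real data.

```python
# A small sketch of GPU image processing in PyTorch: a 3x3 box-blur applied
# with conv2d, on the GPU when CUDA is available (random data stands in for
# a real image).
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

img = torch.rand(1, 1, 64, 64, device=device)              # batch, channel, H, W
kernel = torch.full((1, 1, 3, 3), 1.0 / 9, device=device)  # 3x3 box filter

blurred = F.conv2d(img, kernel, padding=1)  # padding=1 keeps the 64x64 size
print(blurred.shape)
```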

Part two (focus: deep learning using PyTorch):

• Single-layer perceptrons, feed-forward networks, reverse-mode automatic differentiation (Theory)

• Neural networks from scratch using PyTorch (Lab)

• Automatic differentiation using pure Autograd and PyTorch (Lab)

• Convolutional Neural Networks using PyTorch (Lab)

• Using Python MPI and PyTorch with multiple GPUs (Lab)
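The reverse-mode automatic differentiation covered above can be sketched in three lines of PyTorch: build a scalar function of `x`, call `backward()`, and read the gradient off `x.grad`. The function `y = x^2 + 2x` is a toy example chosen for illustration.

```python
# A minimal sketch of reverse-mode automatic differentiation in PyTorch.
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2 * x  # y = x^2 + 2x, so dy/dx = 2x + 2
y.backward()        # reverse-mode AD populates x.grad

print(x.grad)       # dy/dx at x=3 is 2*3 + 2 = 8
```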

Day 03: Applied Deep Learning in Python

Deep Learning Introduction:

- Network architecture
- Activation functions
- Data preprocessing
- Weight initialization
- Batch normalization
- Convolutions
- Pooling

Deep Learning Advanced:

- Best practices: Adam, learning rates
- Transfer learning
- Object detection
- Object segmentation
- Generative models (DCGAN)
- Text-based use cases (RNNs)
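To make the "best practices" item concrete, here is a compact NumPy sketch of a single Adam update step, following the standard Adam rule (moment estimates with bias correction). The function name, toy parameters, and gradient are illustrative assumptions.

```python
# A compact NumPy sketch of one Adam update step: exponential moving averages
# of the gradient (m) and squared gradient (v), bias-corrected, then a scaled
# parameter update. Names and toy values are illustrative.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns new parameters and moment estimates."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
grad = np.array([0.5, -0.5])  # toy gradient
theta, m, v = adam_step(theta, grad, m, v, t=1)
print(theta)
```

Note that on the first step the bias-corrected update is roughly `lr * sign(grad)`, which is why Adam's effective step size is bounded by the learning rate regardless of gradient scale.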

Day 04: Convolutional Neural Networks using Keras

In this session, you will use CNNs to compete in an active Kaggle competition. Each step will be explained in detail.

This hands-on session will take you from 0 to 100 in deep learning with Keras. Our aim is to teach the fundamentals of deep learning with Convolutional Neural Networks (CNNs) based on modern techniques, using the Keras API with the TensorFlow backend. By the end, participants will know how to build deep learning models, how to train them, what to avoid and what to check during training, and how to perform model inference, especially for image-based problems.
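A minimal Keras CNN of the kind built in this session might look like the following, using the Sequential API on the TensorFlow backend. The input shape, layer sizes, and 10-class output are illustrative assumptions, not the competition's actual architecture.

```python
# A minimal sketch of a Keras CNN: conv/pool blocks, a dense head, and a
# softmax classifier, compiled with Adam. Shapes and sizes are illustrative.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),  # small RGB images
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),  # 10-class output
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
print(model.output_shape)
```

Training is then a single `model.fit(X_train, y_train, ...)` call, with validation data passed in to watch for overfitting.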

Day 05: Applied Deep Reinforcement Learning in Python

RL Basics

Policy Gradients

Actor-Critic Algorithms


Evolution Strategies

RL trouble-shooting and debugging strategies

Current research frontiers

There will be tutorial sessions for all topics, followed by hands-on lab sessions using OpenAI Gym.
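The policy-gradient idea behind the first two topics can be sketched without an environment library: a softmax policy over a two-armed bandit, updated with the REINFORCE gradient. The bandit's reward probabilities, learning rate, and step count are toy assumptions; the labs themselves use full OpenAI Gym environments.

```python
# A toy sketch of the REINFORCE policy gradient on a two-armed bandit:
# a softmax policy over action logits, nudged toward the higher-reward arm.
# (Rewards and hyper-parameters are illustrative assumptions.)
import numpy as np

rng = np.random.default_rng(0)
true_rewards = np.array([0.2, 0.8])  # arm 1 pays off more often
logits = np.zeros(2)                 # policy parameters
lr = 0.1

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

for _ in range(2000):
    probs = softmax(logits)
    a = rng.choice(2, p=probs)          # sample an action from the policy
    r = rng.random() < true_rewards[a]  # Bernoulli reward for that arm
    # REINFORCE: grad of log pi(a) w.r.t. logits is one_hot(a) - probs;
    # scale it by the observed reward.
    grad_log_pi = -probs
    grad_log_pi[a] += 1.0
    logits += lr * float(r) * grad_log_pi

print(softmax(logits))  # probability mass should concentrate on arm 1
```

Real policy-gradient methods add a baseline to reduce variance, which is exactly the step from REINFORCE to the actor-critic algorithms on the agenda.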

Partial list of workshop instructors and lab trainers:

Nathaniel Shimoni
Nir Ben-Zvi
Gidi Shperber
Natan Katz
Yam Peleg
Amit Mandelbaum
Yochay Ettun
Eyal Gruss
Roei Pen
Bar Hilleli

Shir Meir Lador

Shlomo Kashani
