
My experiments with speeding up a Deep Belief network using GPU acceleration (CUDA).

In my 7th semester of engineering, my friends and I wanted to work on deep neural networks. We were interested mainly because we had started working on a project that required knowledge of DNNs such as DBNs (Deep Belief Networks) and SAEs/SdAs (stacked autoencoders / stacked denoising autoencoders).

The project was basically to implement this paper: predicting traffic flow on a certain road or highway. We learnt various techniques of neural network prediction, of which the Deep Belief Network (DBN) was my favorite.

Although the paper was based on stacked autoencoders, which I will cover in my next blog post, I wanted to experiment with DBNs.

Here’s a little introduction to DBNs (Deep Belief Networks) from Wikipedia:

In machine learning, a deep belief network (DBN) is a generative graphical model, or alternatively a type of deep neural network, composed of multiple layers of latent variables (“hidden units”), with connections between the layers but not between units within each layer.

When trained on a set of examples in an unsupervised way, a DBN can learn to probabilistically reconstruct its inputs. The layers then act as feature detectors on inputs. After this learning step, a DBN can be further trained in a supervised way to perform classification.

The easiest way to build and test a DBN is with public data sets like MNIST. I created my own DBN by learning from here. deeplearning.net is especially useful for beginners wanting to get their hands on deep neural network algorithms; I think it gives the best insights, is really well documented, and its tutorials are one of the best ways for a beginner to get into the world of deep neural networks. The scikit-learn documentation is another such resource that gives you detailed insight into algorithms and code alike.

For this particular experiment the requirements were (they match the imports in the code below): a CUDA-capable GPU with the CUDA toolkit installed, Python with NumPy, CUDAmat and pyprind, plus matplotlib for plotting the results.

pyprind is a handy Python tool that provides a progress bar and a percentage indicator object that lets you track the progress of a loop or other iterative computation. This is especially useful when dealing with large data sets.
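As a quick illustration, here is a minimal sketch of how pyprind's progress bar is used – the same ProgBar/update() pattern appears in the training loop below (the sleep call just stands in for real work):

[code language="python"]
import time
import pyprind

bar = pyprind.ProgBar(50, title='demo loop')  # we expect 50 iterations
for _ in range(50):
    time.sleep(0.01)                          # placeholder for real work
    bar.update()                              # advance the bar by one step
[/code]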

Here’s the code of the program, which uses CUDAmat to perform the GPU calculations. Save it as rbm_cuda.py – that is the module name imported in the MNIST example further down.

 

[code language="python"]
#coding: utf-8

from __future__ import division
import time
import numpy as np
import cudamat as cm
import pyprind

class RestrictedBoltzmanMachine(object):

    def __init__(self, n_hidden, learning_rate=0.1, momentum=0.9, n_epochs=30, batch_size=128, k=1, title=''):
        self.n_hidden = n_hidden
        self.learning_rate = learning_rate
        self.momentum = momentum
        self.n_epochs = n_epochs
        self.batch_size = batch_size
        self.k = k
        self.title = title

    def transform(self, v, h):
        """
        Parameters:
        v : the visible input activation
        h : the target to write the hidden activation
        """
        cm.dot(self.W.T, v, target=h)
        h.add_col_vec(self.hidden_bias)
        h.apply_sigmoid()

    def sample_hidden(self, v, h_mean, h):
        """
        Parameters:
        v : the visible input activation
        h_mean : the target to write the hidden activation
        h : the target to write the hidden sample
        """
        self.transform(v, h_mean)
        h.fill_with_rand()
        h.less_than(h_mean)

    def sample_visible(self, h, v_mean, v):
        """
        Parameters:
        h : the hidden activation
        v_mean : the target to write the visible activation
        v : the target to write the visible sample
        """
        self.reverse_transform(h, v_mean)
        v.fill_with_rand()
        v.less_than(v_mean)

    def reverse_transform(self, h, v):
        """
        Parameters:
        h : the hidden activation
        v : the target to write the visible activation
        """
        cm.dot(self.W, h, target=v)
        v.add_col_vec(self.visible_bias)
        v.apply_sigmoid()

    def fit(self, input, verbose=1):
        """
        Parameters
        ----------
        input : CUDAMatrix array, shape (n_features, n_samples) - opposite of scikit-learn
        """
        n_samples = input.shape[1]
        num_batches = n_samples // self.batch_size

        # model parameters
        self.n_visible = input.shape[0]

        # initialize weights
        self.W = cm.CUDAMatrix(0.1 * np.random.randn(self.n_visible, self.n_hidden))
        self.visible_bias = cm.CUDAMatrix(np.zeros((self.n_visible, 1)))
        self.hidden_bias = cm.CUDAMatrix(-4. * np.ones((self.n_hidden, 1)))

        # initialize weight updates
        u_W = cm.CUDAMatrix(np.zeros((self.n_visible, self.n_hidden)))
        u_visible_bias = cm.CUDAMatrix(np.zeros((self.n_visible, 1)))
        u_hidden_bias = cm.CUDAMatrix(np.zeros((self.n_hidden, 1)))

        # initialize temporary storage
        v = cm.empty((self.n_visible, self.batch_size))
        h = cm.empty((self.n_hidden, self.batch_size))
        r = cm.empty((self.n_hidden, self.batch_size))

        if verbose == 1:
            bar = pyprind.ProgBar(self.n_epochs, title=self.title)

        for epoch in range(self.n_epochs):
            start_time = time.time()
            err = []

            for batch in range(num_batches):
                # get current minibatch
                v_true = input.slice(batch * self.batch_size, (batch + 1) * self.batch_size)
                v.assign(v_true)

                # apply momentum
                u_W.mult(self.momentum)
                u_visible_bias.mult(self.momentum)
                u_hidden_bias.mult(self.momentum)

                # positive phase
                self.transform(v, h)

                u_W.add_dot(v, h.T)
                u_visible_bias.add_sums(v, axis=1)
                u_hidden_bias.add_sums(h, axis=1)

                # sample hiddens
                r.fill_with_rand()
                r.less_than(h, target=h)

                # negative phase CD-k
                for n in range(self.k):
                    self.reverse_transform(h, v)
                    self.transform(v, h)

                u_W.subtract_dot(v, h.T)
                u_visible_bias.add_sums(v, axis=1, mult=-1.)
                u_hidden_bias.add_sums(h, axis=1, mult=-1.)

                # update weights
                self.W.add_mult(u_W, self.learning_rate / self.batch_size)
                self.visible_bias.add_mult(u_visible_bias, self.learning_rate / self.batch_size)
                self.hidden_bias.add_mult(u_hidden_bias, self.learning_rate / self.batch_size)

                # calculate reconstruction error
                v.subtract(v_true)
                err.append(v.euclid_norm() ** 2 / (self.n_visible * self.batch_size))

            if verbose == 1:
                bar.update()
            elif verbose > 1:
                print("Epoch: %i, MSE: %.6f, Time: %.6f s" % (epoch + 1, np.mean(err), (time.time() - start_time)))

        # free GPU memory used for the updates and temporaries
        u_W.free_device_memory()
        u_visible_bias.free_device_memory()
        u_hidden_bias.free_device_memory()
        v.free_device_memory()
        h.free_device_memory()
        r.free_device_memory()

class DeepBeliefNetwork(object):

    def __init__(self, layers):
        self.layers = layers

    def fit(self, input):
        """
        Train each layer of the network greedily, one after the other
        Parameters
        ----------
        input : A CUDAMatrix shaped as (n_features, n_samples)
        """
        n_samples = input.shape[1]

        for n, layer in enumerate(self.layers):
            layer.fit(input)

            if n + 1 < len(self.layers):
                h = cm.empty((layer.n_hidden, n_samples))
                layer.transform(input, h)

                if n > 0:
                    input.free_device_memory()

                input = h

        if len(self.layers) > 1:
            input.free_device_memory()

    def transform(self, input):
        """
        Transform the input through each layer
        Parameters
        ----------
        input : A CUDAMatrix shaped as the first layer
        Return
        ------
        A newly allocated CUDAMatrix with the shape of the last layer.
        """
        n_samples = input.shape[1]
        for n, layer in enumerate(self.layers):
            h = cm.empty((layer.n_hidden, n_samples))
            layer.transform(input, h)

            if n > 0:
                input.free_device_memory()

            input = h

        return input

    def reverse_transform(self, h):
        """
        Reverse transform from the last to the first layer
        Parameters
        ----------
        h : A CUDAMatrix shaped as the last layer
        Return
        ------
        A new CUDAMatrix with the shape of the first layer
        """
        for n, layer in enumerate(reversed(self.layers)):
            v = cm.empty(layer.visible_bias.shape)
            layer.reverse_transform(h, v)
            if n > 0:
                h.free_device_memory()
            h = v

        return v

    def dream(self, k=10):
        """
        Generate a pattern from this network by Gibbs sampling in the top layer.
        Return
        ------
        A new CUDAMatrix with the shape of the first layer
        """
        last_layer = self.layers[-1]

        v = cm.empty(last_layer.visible_bias.shape)
        h = cm.empty(last_layer.hidden_bias.shape)

        v_mean = cm.empty(last_layer.visible_bias.shape)
        h_mean = cm.empty(last_layer.hidden_bias.shape)

        h.fill_with_rand()
        for _ in range(k):
            last_layer.sample_visible(h, v_mean, v)
            last_layer.sample_hidden(v, h_mean, h)

        v.free_device_memory()
        v_mean.free_device_memory()
        h_mean.free_device_memory()

        return self.reverse_transform(h)
[/code]

 

Using CUDA as above enables dramatic increases in computing performance by harnessing the power of the graphics processing unit (GPU). With millions of CUDA-enabled GPUs sold to date, software developers, scientists and researchers have found broad-ranging uses for GPU computing with CUDA.

Performance matters especially when processing large amounts of data; with the big data boom, more and more people are preferring GPU computation.
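As a rough, informal illustration of why this matters (not a benchmark – the numbers depend entirely on your hardware), here is a sketch that times one large matrix multiplication on the CPU with NumPy and on the GPU with CUDAmat, including the host-to-device transfers:

[code language="python"]
import time
import numpy as np
import cudamat as cm

cm.cublas_init()

n = 4096
a = np.random.randn(n, n).astype(np.float32)
b = np.random.randn(n, n).astype(np.float32)

# CPU: plain NumPy matrix product
t0 = time.time()
c_cpu = np.dot(a, b)
print("CPU time: %.3f s" % (time.time() - t0))

# GPU: copy to the device, multiply with CUDAmat, copy the result back
t0 = time.time()
ga, gb = cm.CUDAMatrix(a), cm.CUDAMatrix(b)
c_gpu = cm.dot(ga, gb).asarray()
print("GPU time (incl. transfers): %.3f s" % (time.time() - t0))

cm.cublas_shutdown()
[/code]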

Here’s how you can run the DBN above on the MNIST data with GPU processing.

#1 Initialize :

from __future__ import division

import numpy as np
from matplotlib import pyplot as plt
from rbm_cuda import RestrictedBoltzmanMachine, DeepBeliefNetwork
import cudamat as cm

%matplotlib inline

# Initialize CUDA
cm.cublas_init()
cm.CUDAMatrix.init_random(1)

#2 Load training data:

X = np.load("input/mnist.npy")

# Load the data onto the GPU (it needs to be in (n_features, n_samples) shape, so we pass the transpose)
Xc = cm.CUDAMatrix(X.T)
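A note on input/mnist.npy: it is simply a NumPy array of flattened MNIST digits. I'm assuming here that it holds pixel values scaled to [0, 1] with shape (n_samples, 784); the exact preprocessing isn't shown in this post. If you don't have such a file yet, a one-off script along these lines (using scikit-learn's MNIST fetcher – an assumption, any MNIST loader will do) can create it:

[code language="python"]
# Hypothetical helper to create input/mnist.npy (assumes scikit-learn can download MNIST)
import numpy as np
from sklearn.datasets import fetch_openml

mnist = fetch_openml('mnist_784', version=1)
X = np.asarray(mnist.data, dtype=np.float32) / 255.0  # scale pixels to [0, 1]
np.save("input/mnist.npy", X)                          # shape: (70000, 784)
[/code]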

#3 Create a DBN with stacked RBM layers (six RBMs in this example):

dbn = DeepBeliefNetwork([ RestrictedBoltzmanMachine(512, title="layer-1", n_epochs=50, batch_size=1000, momentum=.1),
                          RestrictedBoltzmanMachine(512, title="layer-2", n_epochs=50, batch_size=1000, momentum=.3),
                          RestrictedBoltzmanMachine(128, title="layer-3", n_epochs=50, batch_size=1000, momentum=.4),
                          RestrictedBoltzmanMachine(128, title="layer-4", n_epochs=50, batch_size=1000, momentum=.5),
                          RestrictedBoltzmanMachine(64 , title="layer-5", n_epochs=50, batch_size=1000, momentum=.6),
                          RestrictedBoltzmanMachine(64 , title="layer-6", n_epochs=50, batch_size=1000, momentum=.8),
                        ])

 

#4 Train the DBN on the data:

dbn.fit(Xc)
layer-1
0%                          100%
[##############################] | ETA[sec]: 0.000 
Total time elapsed: 9.801 sec
layer-2
0%                          100%
[##############################] | ETA[sec]: 0.000 
Total time elapsed: 6.510 sec
layer-3
0%                          100%
[##############################] | ETA[sec]: 0.000 
Total time elapsed: 2.803 sec
layer-4
0%                          100%
[##############################] | ETA[sec]: 0.000 
Total time elapsed: 1.860 sec
layer-5
0%                          100%
[##############################] | ETA[sec]: 0.000 
Total time elapsed: 1.607 sec
layer-6
0%                          100%
[##############################] | ETA[sec]: 0.000 
Total time elapsed: 1.418 sec

 

#5 Plot data generated by the network:

plt.figure(figsize=(12.2, 12))
for i in range(100):
    plt.subplot(10, 10, i + 1)
    plt.imshow(dbn.dream(k=50).asarray().reshape((28, 28)), cmap=plt.cm.gray_r,
               interpolation='nearest')
    plt.xticks(())
    plt.yticks(())
plt.subplots_adjust(0.08, 0.02, 0.92, 0.85, 0.08, 0.23)
plt.show()

 

[Image: 10x10 grid of MNIST-like digits generated by the trained DBN]
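As an aside before shutting down: besides dreaming up new digits, the trained stack can encode a real digit and decode it back using the transform / reverse_transform methods defined above. Here is a small sketch (note that reverse_transform, as written, allocates single-column buffers, so it handles one sample at a time):

[code language="python"]
# Encode the first MNIST digit through all six layers, then decode it again
code = dbn.transform(Xc.slice(0, 1))   # (64, 1) after the last layer
recon = dbn.reverse_transform(code)    # (784, 1) back at the input layer

plt.figure(figsize=(4, 2))
plt.subplot(1, 2, 1)
plt.imshow(X[0].reshape((28, 28)), cmap=plt.cm.gray_r, interpolation='nearest')
plt.title("original")
plt.subplot(1, 2, 2)
plt.imshow(recon.asarray().reshape((28, 28)), cmap=plt.cm.gray_r, interpolation='nearest')
plt.title("reconstruction")
plt.show()

code.free_device_memory()
recon.free_device_memory()
[/code]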

#6 Shut down CUDA:

# shutdown CUDA
cm.cublas_shutdown()

 

Hope this post helps someone. Feel free to experiment with it yourself, and let me know if you come across any errors or have any questions.

CARPE DIEM

 

You can follow me on GitHub!


Let’s talk Python.

Python is one of a handful of modern programming languages gaining a lot of traction in the development community. It was created by Guido van Rossum (first released in 1991) and named after – you guessed it – the comedy show “Monty Python’s Flying Circus”. Like Java, once written, Python programs can be run on any operating system.

Linux-based operating systems such as Ubuntu are Python’s home in many ways. Ubuntu’s own Software Center is written in Python, and many Linux apps have a Python code base.

Setting up Python (Linux)

I’m going to briefly give you step-by-step instructions on how to set up a Python development environment.

Follow these simple steps :

Ubuntu 14.04

Ubuntu makes getting started easy, as it comes with a command-line version of Python pre-installed. In fact, the Ubuntu community develops many of its scripts and tools in Python. You can begin with either the command-line interpreter or IDLE, the graphical development environment that ships with Python.

Checking the version (command line)

We first need to check whether Python is already installed on your workstation. Ubuntu 14.04 comes with both Python 2 and Python 3 installed: typing python at the shell prompt launches Python 2, while the python3 command launches Python 3. But first, check the version using:

$ python -V
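You can check the Python 3 version in the same way (assuming it is installed):

$ python3 -V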

Open a terminal window and type ‘python’ (without the quotes). This opens python in interactive mode.

While this mode is good for initial learning, you may prefer to use a text editor (like Gedit, Vim or Emacs) to write your code. As long as you save it with the .py extension, it can be executed in the terminal window.

[Screenshot: the Python interactive prompt in a terminal]

Python Programming with IDLE

Writing long programs from the command line is a tedious task. In that case, you can try IDLE. Open a terminal window and type ‘idle’ (without the quotes) and you’ll see the Python graphical shell load. If you do not have IDLE installed, you can install it as shown below.

[Screenshot: installing IDLE from the terminal]
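On Ubuntu, the install step shown above boils down to grabbing the IDLE packages from the terminal (idle for Python 2, idle3 for Python 3 – package names may vary slightly between releases):

sudo apt-get install idle idle3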

Once IDLE is installed, open a terminal window and type ‘idle’ again and the graphical shell will load.

To write a Python script, click on File > New Window. This opens an editor where you can type your script. Save the file with a .py extension, then click Run > Run Module from the menu to run the program.

[Screenshot: editing and running a script in IDLE]

Note: Many programming languages typically ignore whitespace – that is, the spacing in your code. But in Python, improper use of spacing can generate syntax errors or even change a program’s meaning, as the small example below shows. As an organization nut of sorts, I can appreciate the simplicity and readability of properly aligned code, but for some this takes a bit of getting used to.
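For example, these two snippets use the same words but mean different things – in Python the indentation alone decides which lines belong to the if block:

[code language="python"]
x = 5

if x > 3:
    print("greater")
    print("done")   # indented: runs only when x > 3

if x > 3:
    print("greater")
print("done")       # not indented: always runs, whatever x is
[/code]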

In case Python 3 isn’t present, you will see an error message saying the program is not installed, or some message of that sort depending on the Linux distro.

Now you will need to do the following :

Install python3

$ sudo apt-get install python3 python3-dev

 

The easy_install and pip Python package managers are commonly used to install and update packages. Install both of these tools using the following commands:

sudo apt-get install python-setuptools
sudo apt-get install python-pip
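With pip in place, you can also pull in the Python packages used in the DBN experiment at the top of this post. The steps below are only a sketch, not gospel: CUDAmat is installed from its GitHub source and needs the CUDA toolkit (nvcc) already set up, so check the cudamat README for the exact build steps on your system.

sudo pip install numpy matplotlib pyprind
git clone https://github.com/cudamat/cudamat.git
cd cudamat
sudo python setup.py install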

A full-fledged Python IDE: Eclipse + PyDev.

Before installing Eclipse with PyDev, install Java 7 and the JDK. Open the terminal and type the following:

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java7-installer

Once this is done, you can test the installation by typing java -version in the terminal. This command will return the installed version of Java.

Final step:

To install eclipse:

sudo apt-get install eclipse

After installing, run Eclipse and go to Help -> Install New Software, then click Add.

Add the PyDev update site location (the URL is listed on pydev.org) and click OK.

Click on PyDev, then click Next -> Install.

Go to Window -> Preferences -> PyDev -> Interpreter. Click New, add the path to the Python interpreter, and click OK.

Voila! You have now successfully set up a Python development environment.

Now that you have Python up and running, you’ll undoubtedly want to move on to a bit more complex programs. Here are some of the resources on the web that I found useful for getting started. One of the best resources is the main Python website.

  • www.learnpython.org
  • www.codecademy.com
  • quora thread
  • Python Essential Reference on Amazon

Conclusion

I hope you enjoyed this brief glimpse into Python. This is a language supported by a wide community and used by Google, NASA, Ubuntu and others. Give it a try for your next development project. Happy coding!

 

PS: Instructions assume that you are connected to the Internet.


HELLO, “HELLO WORLD!”

 

CHAPTER 1:

The Logo –

It was the summer of 1999. I was 5 years old when I first laid eyes on a computer. It was the Golden Age of Microsoft Windows. I watched, mesmerized, as the Windows 95 logo slowly appeared on the screen of a CRT monitor. The technician had just finished setting it up and was now walking my mother through the details of regular use. My mom had just begun to pursue her Master of Computer Applications (MCA) degree and needed the PC for her course work. She was as eager and curious as I was. It was the first time our family had a personal computer. We had holidays for the summer and my cousins had come over for lunch. At once, we began exploring this newfound marvel; I still remember every little detail of that first exploration. Alas, it was a short “expedition”. My mom walked in, livid. She didn’t want us anywhere near the computer! But I was still eager to explore.

CHAPTER 2:

SkiFree –

As the holidays passed, Akka (as I like to call my sister) and I would use “MS Paint” on the computer to try and draw random stuff we saw every day. We weren’t very good (chuckles). I can still see those lopsided houses and impossibly shaped teapots. Akka would draw these and let me fill them in with colors. It was a day like any other. We found “Paint” at lightning speed. Computer time was precious (and limited) to us. But this time, the “Games” section caught our eyes. This was to be a secret between Akka and me, as we were sure Mom wouldn’t approve. We quickly clicked on “Games” and a list of games came up. I could see this pixelated image of a person skiing and curiosity got the best of us. We started up the game and slowly learnt how to play it. Soon we got scores in the 25’s and 30’s. That “New High Score Achieved” dialog on the game screen kept us going (as if we had accomplished something). We were hooked from the get-go. I remember it like it was yesterday. I was fascinated by that game; the fact that I could control something on-screen with the stroke of a key was just astounding to me.

Chapter 3:

Windows 98 and more –

Growing up, my typical computer usage was gaming. Some of the most interesting games I would play all day were Prince, Dave, Pinball and many more that I can’t seem to remember. An uncle of mine had come to visit. He had with him a “Windows 98” CD and began installing it on our computer. Like always, Akka and I sat next to him, just clueless. But when he restarted the computer, all my games were gone!! He tried explaining to us how it was better than what we were using before; my sister nodded like she understood but I was still a bit confused. He introduced me to these new words, “update” and “upgrade”. Fast forward to high school. Our school introduced a subject called “Computers”. That was when things started to make sense to me. Power On/Off, shut down, restart, “My Computer”, files, the “Recycle Bin” and so much more. Towards the end of my 10th grade I became more aware of software, the Open Source initiative, Operating Systems, and more importantly, how I could get a CD delivered to my home address for free! Remember when Ubuntu used to do this? Next, I began to study basic HTML and how to design web pages, the basics of the C language, and how to use Microsoft Office. But more importantly, I was aching to know how they made SkiFree! During my junior and senior years (11th & 12th grade), I learnt to program in C/C++. I tried to use Open Source alternatives whenever I could and was always interested in picking up a new language. My projects during that time were mainly in C++.

Chapter 4 :

Present Day –

Today, I am nearing the end of my undergraduate course in Computer Science and Engineering. Ever since that first encounter, my passion and fascination for understanding computers and their behaviour has kept growing. That love is what got me interested in Machine Learning and Artificial Intelligence. I have implemented a number of projects that I will be sharing with you so we can learn together. I have picked up an interest in Python in recent years and will try to use it as much as I can. Other platforms I use include Java, MATLAB/Octave and C++.

PS: I have tried to give a detailed account of my favourite episodes in life, as those were the times that, I think, had the biggest impact on my understanding of the subject I love so much.
PPS : I shall keep updating this post as life goes on.