Blog


Installing TensorFlow on Windows 10 with GPU

posted Feb 3, 2018, 8:12 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Feb 4, 2018, 12:26 AM ]

This post supersedes my earlier post on simply installing CUDA.

This is my version of the installation procedure, for my Windows 10 machine with a GPU.

  • Install driver R390:

http://us.download.nvidia.com/Windows/390.77/390.77-desktop-win10-64bit-international-whql.exe

  • Install CUDA Toolkit 8.0:
  • Double-click the installer and follow the prompts until installation completes.


  • After installation, check that the sample programs exist in this directory:
C:\ProgramData\NVIDIA Corporation\CUDA Samples\v8.0\bin\win64\Release

  • cuDNN v6.0
To download cuDNN, join the NVIDIA Developer network at https://developer.nvidia.com/cudnn. Then choose cuDNN v6.0 from the list.

The TensorFlow documentation says "the cuDNN version must match exactly: TensorFlow will not load if it cannot find cuDNN64_6.dll."
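Once TensorFlow itself is installed (a step not covered above; I assume a pip install of tensorflow-gpu), a minimal sanity check that the GPU is visible might look like this sketch:

```python
# Minimal sanity check for a GPU build of TensorFlow.
# Assumes tensorflow-gpu was installed via pip; if cuDNN64_6.dll
# cannot be found, the import itself fails with a DLL load error.
import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.__version__)

# List the devices TensorFlow can see; a working setup
# includes at least one device of type "GPU".
gpus = [d.name for d in device_lib.list_local_devices()
        if d.device_type == 'GPU']
print('GPUs found:', gpus)
```

If the list is empty, TensorFlow loaded but fell back to CPU-only execution, which usually points at a driver or CUDA/cuDNN version mismatch.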

References:
  • http://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html
  • https://www.tensorflow.org/install/install_windows#requirements_to_run_tensorflow_with_gpu_support
  • http://docs.nvidia.com/deeplearning/sdk/cudnn-install/index.html

Programmable CPLD Timer

posted Jan 30, 2018, 8:13 AM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Jan 30, 2018, 4:47 PM ]

We describe a programmable timer which produces a pulse whose length is determined by the binary value at input X.


The circuit contains a 1 Hz timing generator, a counter, a comparator and an SR latch.


A 50 MHz system clock is assumed.

The Trig signal initiates timer operation and is connected to three points in the module. At the timing reference, Trig loads 50 million into the 26-bit accumulator. At the counter, Trig clears the counter to 0. At the output SR latch, Trig sets Out high.

The timing reference produces a pulse one clock cycle wide every second. The pulse, labeled Z, enables the counter so that the counter increases by 1 every second.

The comparator outputs a reset pulse to the SR latch when counter value equals X.

If you just want a 1 second pulse, connect Z to the set input of the latch (and skip the counter and comparator).
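The behavior described above can be sketched as a plain Python behavioral model (not synthesizable HDL; the class and method names here are mine). Each call to tick() stands in for one 1 Hz reference pulse Z, so the 50 MHz clock divider is abstracted away:

```python
# Behavioral model of the programmable timer:
# counter + comparator + SR latch, clocked by the 1 Hz pulse Z.

class ProgrammableTimer:
    def __init__(self, x):
        self.x = x          # pulse length in seconds (comparator input X)
        self.counter = 0    # seconds counter
        self.out = False    # SR latch output (Out)

    def trig(self):
        """Trig clears the counter and sets the output latch."""
        self.counter = 0
        self.out = True

    def tick(self):
        """One 1 Hz reference pulse Z: count, then compare against X."""
        if self.out:
            self.counter += 1
            if self.counter == self.x:   # comparator resets the latch
                self.out = False

timer = ProgrammableTimer(x=3)
timer.trig()
trace = []
for _ in range(5):
    trace.append(timer.out)
    timer.tick()
print(trace)   # a 3-second-wide pulse: [True, True, True, False, False]
```

Sampling Out once per second shows it staying high for exactly X seconds after Trig, which is the intended behavior of the hardware.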

15 Deep Learning Open Courses & Tutorials

posted Jan 29, 2018, 8:17 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Jan 29, 2018, 8:30 PM ]

Credit to [https://sky2learn.com/deep-learning-reinforcement-learning-online-courses-and-tutorials-theory-and-applications.html]


Deep learning and deep reinforcement learning have recently been successfully applied to a wide range of real-world problems. Here are 15 online courses and tutorials in deep learning and deep reinforcement learning, and applications in natural language processing (NLP), computer vision, and control systems. 12 of them include video lectures. The courses cover the fundamentals of neural networks, convolutional neural networks, recurrent networks and variants, difficulties in training deep networks, unsupervised learning of representations, deep belief networks, deep Boltzmann machines, deep Q-learning, value function estimation and optimization, and Monte Carlo tree search. Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville is a great open-access textbook used by many of the courses, and David Silver provides a good series of 10 video lectures on reinforcement learning. For machine learning reviews, here are 15 online courses and tutorials for machine learning.

Deep Learning Specialization

Andrew Ng. Founder of Coursera. Coursera. 2017

This is a series of five sub-courses, teaching the fundamentals of deep learning as well as how to apply deep learning in various areas, for example, healthcare, autonomous driving, sign language reading, music generation, and natural language processing. You will gain hands-on experience in using TensorFlow to solve real problems. The course has video lectures.

Deep Learning

Ruslan Salakhutdinov. Carnegie Mellon University, Director of AI Research at Apple. 2017.

This course starts with basic topics ranging from feedforward neural nets and backpropagation to convolutional models. Then the essentials of deep learning are introduced, including directed and undirected graphical models, independent component analysis (ICA), sparse coding, autoencoders, restricted Boltzmann machines (RBMs), Monte Carlo methods, deep belief networks, deep Boltzmann machines, and Helmholtz machines. Additional topics include regularization and optimization in deep nets, sequence modeling, and deep reinforcement learning.

Theories of Deep Learning

David Donoho, Hatef Monajemi, and Vardan Papyan. Stanford University. 2017.

This course discusses theoretical aspects of deep learning. There are 8 invited guest lectures from leading scholars in deep learning, computational neuroscience, and statistics. You will have a chance to explore their diverse and interdisciplinary viewpoints on current research trends in deep learning. The course has video lectures.

Deep Learning

Yoshua Bengio. Université de Montréal, Head of the Montreal Institute for Learning Algorithms (MILA). 2016

The course reviews the basics of neural networks, including perceptrons, backpropagation and gradient optimization. It then covers advanced subjects in neural networks, probabilistic graphical models, deep networks, and representation learning.

Deep Learning and Reinforcement Learning Summer School 2017

This summer school was organized by Yoshua Bengio, Head of the Montreal Institute for Learning Algorithms (MILA) at the Université de Montréal, and his colleagues. 2017.

The summer school covers two tracks, deep learning and reinforcement learning. The invited speakers are leading scholars and researchers in these fields. They cover the fundamental knowledge of deep learning and reinforcement learning. In addition, both tracks also discuss the most recent research trends and discoveries in these areas. The summer school includes video lectures.

Deep Reinforcement Learning

Sergey Levine. University of California at Berkeley. 2017.

The course covers the basics of reinforcement learning: Q-learning and policy gradients. It also includes the advanced model learning and prediction, distillation, reward learning, as well as advanced deep RL, for example, trust region policy gradients, actor-critic methods, exploration. This course has video lectures.

Deep Learning

Vincent Vanhoucke, Principal Scientist at Google and Director of the Brain Robotics Research team, and Arpan Chakraborty. Google via Udacity. 2017.

The course covers deep learning, deep neural networks, convolutional neural networks and deep models for text and sequences. The assignments will require you to use TensorFlow for practical experiences. The course has video lectures.

Deep Learning for Natural Language Processing

Phil Blunsom at University of Oxford, Chris Dyer at Carnegie Mellon University, Edward Grefenstette at DeepMind, Karl Moritz Hermann at DeepMind, Andrew Senior, Wang Ling at DeepMind, and Jeremy Appleyard at Nvidia. University of Oxford. 2017.

The course covers the fundamentals of deep learning and how it applies in natural language processing. You will learn how to define mathematical problems in this field, as well as to get hands-on programming experience in CPU and GPU. This course includes video lectures.

Convolutional Neural Networks for Visual Recognition

Fei-Fei Li. Stanford University, Director of Stanford AI Lab and Chief Scientist AI/ML of Google Cloud. 2017.

The course will cover the basics of deep learning, and how to apply deep learning techniques in computer vision. Students will get hands-on experience in how to train and fine-tune neural networks through the assignments and the final project. Python will be mainly used in the course. This course includes video lectures.

Deep Reinforcement Learning and Control

Ruslan Salakhutdinov at Carnegie Mellon University, Director of AI Research at Apple, and Katerina Fragkiadaki at Carnegie Mellon University. Carnegie Mellon University. 2017.

This course covers the fundamentals of deep learning and reinforcement learning: Markov decision processes (MDPs), partially observable Markov decision processes (POMDPs), temporal difference learning, Q-learning, and deep Q-learning. Advanced topics include optimal control, trajectory optimization, hierarchical RL, and transfer learning.

Tutorial: Deep Reinforcement Learning, RLDM 2015

David Silver at Google DeepMind. 2nd Multidisciplinary Conference on Reinforcement Learning and Decision Making (RLDM), Edmonton 2015.

In this 1.5-hour video tutorial, you will learn the fundamentals of deep learning and reinforcement learning, and how to combine DL and RL with various approaches: deep value functions, deep policies, and deep models. You will also learn from a leading expert how to handle the divergence issues in these methods.

Tutorial: Deep Learning, Simons Institution, 2017

Ruslan Salakhutdinov. Carnegie Mellon University, Director of AI Research at Apple. 2017.

This tutorial consists of four one-hour video lectures, giving students a quick yet in-depth introduction to deep learning. It covers topics from supervised and unsupervised learning to model evaluation and open research questions in deep learning.

Tutorial: Deep Reinforcement Learning, Simons Institution 2017

Pieter Abbeel. University of California at Berkeley. 2017.

This is a one-hour tutorial on deep reinforcement learning, with a video lecture. It gives you a glimpse of how deep reinforcement learning works.

Youtube: Deep Reinforcement Learning, MLSS 2016

John Schulman, Research scientist at OpenAI. Machine Learning Summer Schools - MLSS. 2016.

This tutorial includes four one-hour video lectures with practice on a lab problem.

Youtube Lecture Collection | Natural Language Processing with Deep Learning (Winter 2017)

Christopher Manning at Stanford University and Richard Socher, Chief Scientist at Salesforce. Stanford University. 2017.

This is the archived version of "CS224n: Natural Language Processing with Deep Learning", taught in Winter 2017 at Stanford, with eighteen video lectures. There is also an ongoing version of the course, starting in 2018. It discusses how to apply deep learning in natural language processing, as well as issues in NLP and limits of deep learning for NLP.

Using Youtube-dl

posted Jan 29, 2018, 8:13 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Jan 30, 2018, 8:48 AM ]

Once Youtube-dl is installed, you can list all available options this way:

$ youtube-dl --help

Youtube-dl supports many, many formats. To list the formats available for a particular video (in this case the Baby Shark song), do this:

$ youtube-dl -F https://www.youtube.com/watch?v=XqZsoesa55w

You should see the format list as follows:

[youtube] XqZsoesa55w: Downloading webpage
[youtube] XqZsoesa55w: Downloading video info webpage
[youtube] XqZsoesa55w: Extracting video information
WARNING: unable to extract uploader nickname
[info] Available formats for XqZsoesa55w:
format code  extension  resolution note
249          webm       audio only DASH audio   61k , opus @ 50k, 737.69KiB
250          webm       audio only DASH audio   78k , opus @ 70k, 962.10KiB
140          m4a        audio only DASH audio  127k , m4a_dash container, mp4a.40.2@128k, 2.06MiB
171          webm       audio only DASH audio  129k , vorbis@128k, 1.71MiB
251          webm       audio only DASH audio  149k , opus @160k, 1.82MiB
278          webm       256x144    144p  108k , webm container, vp9, 30fps, video only, 1.61MiB
160          mp4        256x144    144p  113k , avc1.4d400c, 30fps, video only, 1.68MiB
133          mp4        426x240    240p  245k , avc1.4d4015, 30fps, video only, 2.50MiB
242          webm       426x240    240p  270k , vp9, 30fps, video only, 3.66MiB
243          webm       640x360    360p  513k , vp9, 30fps, video only, 6.81MiB
134          mp4        640x360    360p  569k , avc1.4d401e, 30fps, video only, 4.66MiB
244          webm       854x480    480p  910k , vp9, 30fps, video only, 11.84MiB
135          mp4        854x480    480p 1159k , avc1.4d401f, 30fps, video only, 9.39MiB
247          webm       1280x720   720p 1784k , vp9, 30fps, video only, 22.52MiB
136          mp4        1280x720   720p 2153k , avc1.4d401f, 30fps, video only, 15.62MiB
248          webm       1920x1080  1080p 3118k , vp9, 30fps, video only, 39.37MiB
137          mp4        1920x1080  1080p 3680k , avc1.640028, 30fps, video only, 23.62MiB
17           3gp        176x144    small , mp4v.20.3, mp4a.40.2@ 24k
36           3gp        320x180    small , mp4v.20.3, mp4a.40.2
43           webm       640x360    medium , vp8.0, vorbis@128k
18           mp4        640x360    medium , avc1.42001E, mp4a.40.2@ 96k
22           mp4        1280x720   hd720 , avc1.64001F, mp4a.40.2@192k (best)


Next, choose any format you want to download with the flag -f as shown below:

$ youtube-dl -f 18 https://www.youtube.com/watch?v=XqZsoesa55w

If you want to download the video in mp3 audio format, then give this command:

$ youtube-dl -x --audio-format mp3 https://www.youtube.com/watch?v=XqZsoesa55w

Installing Youtube-dl

posted Jan 29, 2018, 8:06 PM by MUHAMMAD MUN`IM AHMAD ZABIDI

Youtube-dl: It's the hacker's YouTube downloader.

Why use an easy-to-use GUI-based downloader when you can do it more efficiently from the command line? ;)

Youtube-dl is not available in the Ubuntu repo. To get it from the official website, you can download it using the curl command.

First, install curl.

$ sudo apt-get install curl -y

Then, download the binary:

$ sudo curl -L https://yt-dl.org/latest/youtube-dl -o /usr/bin/youtube-dl

Next, change the permission of the binary:

$ sudo chmod 755 /usr/bin/youtube-dl

Once installed, you can proceed to the next step.


Which Nvidia Card on Your Ubuntu?

posted Jan 3, 2018, 8:30 PM by MUHAMMAD MUN`IM AHMAD ZABIDI

To find out if your GPU is installed properly and working, run the following command:

munim@dnn:~$ sudo lspci -nnk | grep -i nvidia

01:00.0 VGA compatible controller [0300]: NVIDIA Corporation Device [10de:1c81] (rev a1)
Kernel driver in use: nvidia
Kernel modules: nvidiafb, nouveau, nvidia_375_drm, nvidia_375
01:00.1 Audio device [0403]: NVIDIA Corporation Device [10de:0fb9] (rev a1)

To confirm that the driver was installed correctly and that your GPU is being recognized, run nvidia-smi. This command is also useful if you want to check performance metrics of the GPU.

munim@dnn:~$ sudo nvidia-smi
Thu Jan  4 19:23:37 2018       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 375.66                 Driver Version: 375.66                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 1050    Off  | 0000:01:00.0      On |                  N/A |
|  0%   46C    P8    35W /  70W |    399MiB /  1998MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0      1213    G   /usr/lib/xorg/Xorg                             207MiB |
|    0      3181    G   compiz                                         135MiB |
|    0      6461    G   ...el-token=D58F2E98F72FF8E38A4F40A622AEE0C3    54MiB |
+-----------------------------------------------------------------------------+


Tips to learn a new language

posted Jan 1, 2018, 5:26 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Jan 1, 2018, 6:18 PM ]

Found this blog here http://darasteine.tumblr.com/post/146643431652/tips-to-learn-a-new-language

  • The 75 most common words make up 40% of occurrences
  • The 200 most common words make up 50% of occurrences
  • The 524 most common words make up 60% of occurrences
  • The 1257 most common words make up 70% of occurrences
  • The 2925 most common words make up 80% of occurrences
  • The 7444 most common words make up 90% of occurrences
  • The 13374 most common words make up 95% of occurrences
  • The 25508 most common words make up 99% of occurrences

Sources: 5 Steps to Speak a New Language  (PDF) by Hung Quang Pham

This article has an excellent summary on how to rapidly learn a new language within 90 days.

 We can begin by studying the first 600 words. Of course, chunking is an effective way to memorize words readily. Here’s a list, to translate into the language you desire to learn, that I grabbed from here! :) 

EXPRESSIONS OF POLITENESS (about 50 expressions)       

  • ‘Yes’ and ‘no’: yes, no, absolutely, no way, exactly.    
  • Question words: when? where? how? how much? how many? why? what? who? which? whose?    
  • Apologizing: excuse me, sorry to interrupt, well now, I’m afraid so, I’m afraid not.    
  • Meeting and parting: good morning, good afternoon, good evening, hello, goodbye, cheers, see you later, pleased to meet you, nice to have met.    
  • Interjections: please, thank you, don’t mention it, sorry, it’ll be done, I agree, congratulations, thank heavens, nonsense.    

NOUNS (about 120 words)

  • Time: morning, afternoon, evening, night; Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday; spring, summer, autumn, winter; time, occasion, minute, half-hour, hour, day, week, month, year.   
  • People: family, relative, mother, father, son, daughter, sister, brother, husband, wife; colleague, friend, boyfriend, girlfriend; people, person, human being, man, woman, lady, gentleman, boy, girl, child.   
  • Objects: address, bag, book, car, clothes, key, letter (=to post), light (=lamp), money, name, newspaper, pen, pencil, picture, suitcase, thing, ticket.    
  • Places: place, world, country, town, street, road, school, shop, house, apartment, room, ground; Britain, name of the foreign country, British town-names, foreign town-names.    
  • Abstract: accident, beginning, change, color, damage, fun, half, help, joke, journey, language, English, name of the foreign language, letter (of alphabet), life, love, mistake, news, page, pain, part, question, reason, sort, surprise, way (=method), weather, work.    
  • Other: hand, foot, head, eye, mouth, voice; the left, the right; the top, the bottom, the side; air, water, sun, bread, food, paper, noise.    

PREPOSITIONS (about 40 words)    

  • General: of, to, at, for, from, in, on.    
  • Logical: about, according-to, except, like, against, with, without, by, despite, instead of.    
  • Space: into, out of, outside, towards, away from, behind, in front of, beside, next to, between, above, on top of, below, under, underneath, near to, a long way from, through.    
  • Time: after, ago, before, during, since, until.    

DETERMINERS (about 80 words)   

  • Articles and numbers: a, the; nos. 0–20; nos. 30–100; nos. 200–1000; last, next, 1st–12th.    
  • Demonstrative: this, that.    
  • Possessive: my, your, his, her, its, our, their.    
  • Quantifiers: all, some, no, any, many, much, more, less, a few, several, whole, a little, a lot of.    
  • Comparators: both, neither, each, every, other, another, same, different, such.    

ADJECTIVES (about 80 words)    

  • Color: black, blue, green, red, white, yellow.    
  • Evaluative: bad, good, terrible; important, urgent, necessary; possible, impossible; right, wrong, true.    
  • General: big, little, small, heavy; high, low; hot, cold, warm; easy, difficult; cheap, expensive; clean, dirty; beautiful, funny (=comical), funny (=odd), usual, common (=shared), nice, pretty, wonderful; boring, interesting, dangerous, safe; short, tall, long; new, old; calm, clear, dry; fast, slow; finished, free, full, light (=not dark), open, quiet, ready, strong.    
  • Personal: afraid, alone, angry, certain, cheerful, dead, famous, glad, happy, ill, kind, married, pleased, sorry, stupid, surprised, tired, well, worried, young.    

VERBS (about 100 words)    

  • arrive, ask, be, be able to, become, begin, believe, borrow, bring, buy, can, change, check, collect, come, continue, cry, do, drop, eat, fall, feel, find, finish, forget, give, going to, have, have to, hear, help, hold, hope, hurt (oneself), hurt (someone else), keep, know, laugh, learn, leave, lend, let (=allow), lie down, like, listen, live (=be alive), live (=reside), look (at), look for, lose, love, make, may (=permission), may (=possibility), mean, meet, must, need, obtain, open, ought to, pay, play, put, read, remember, say, see, sell, send, should, show, shut, sing, sleep, speak, stand, stay, stop, suggest, take, talk, teach, think, travel, try, understand, use, used to, wait for, walk, want, watch, will, work (=operate), work (=toil), worry, would, write.    

PRONOUNS (about 40 words) 

  • Personal: I, you, he, she, it, we, they, one; myself, yourself, himself, herself, itself, ourselves, yourselves, themselves.    
  • Possessive: mine, yours, his, hers, its, ours, theirs.    
  • Demonstrative: this, that.    
  • Universal: everyone, everybody, everything, each, both, all, one, another.    
  • Indefinite: someone, somebody, something, some, a few, a little, more, less; anyone, anybody, anything, any, either, much, many.    
  • Negative: no-one, nobody, nothing, none, neither.    

ADVERBS (about 60 words) 

  • Place: here, there, above, over, below, in front, behind, nearby, a long way away, inside, outside, to the right, to the left, somewhere, anywhere, everywhere, nowhere, home, upstairs, downstairs.    
  • Time: now, soon, immediately, quickly, finally, again, once, for a long time, today, generally, sometimes, always, often, before, after, early, late, never, not yet, still, already, then (=at that time), then (=next), yesterday, tomorrow, tonight.    
  • Quantifiers: a little, about (=approximately), almost, at least, completely, very, enough, exactly, just, not, too much, more, less.    
  • Manner: also, especially, gradually, of course, only, otherwise, perhaps, probably, quite, so, then (=therefore), too (=also), unfortunately, very much, well.    

CONJUNCTIONS (about 30 words) 

  • Coordinating: and, but, or; as, than, like.    
  • Time & Place: when, while, before, after, since (=time), until; where.    
  • Manner & Logic: how, why, because, since (=because), although, if; what, who, whom, whose, which, that.    

Deep Learning for Dummies

posted Dec 25, 2017, 4:06 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Dec 25, 2017, 4:10 PM ]

Learning Deep Learning

posted Dec 21, 2017, 10:23 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Dec 21, 2017, 10:24 PM ]

Jumpstart GNU for ARM Cortex-M0

posted Dec 8, 2017, 6:41 PM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Dec 8, 2017, 7:13 PM ]

On Windows, click the first download link for this file (link is current as of 9 Dec 2017):

File: gcc-arm-none-eabi-6-2017-q2-update-win32.exe (82.56 MB)

Once downloaded, double-click the installer.

The tools will be installed in the directory below. You may want to update your environment variables...

C:\Program Files (x86)\GNU Tools ARM Embedded\6 2017-q2-update\bin\

To test it, make a short C program.

int main()
{
  int a = 0x55;
  int b;

  b = a ^ 0xff00;
  return 0;
}


Compile it (my code is at ~\cortex directory):

C:\Users\Mun3im\cortex>arm-none-eabi-gcc  --specs=nosys.specs -mcpu=cortex-m0 -S test.c

The assembler output is test.s. The assembly code corresponding to the C code is annotated in the next section; the rest of the code is just scaffolding.

C:\Users\Mun3im\cortex>more test.s
        .cpu cortex-m0
        .eabi_attribute 20, 1
        .eabi_attribute 21, 1
        .eabi_attribute 23, 3
        .eabi_attribute 24, 1
        .eabi_attribute 25, 1
        .eabi_attribute 26, 1
        .eabi_attribute 30, 6
        .eabi_attribute 34, 0
        .eabi_attribute 18, 4
        .file   "test.c"
        .text
        .align  1
        .global main
        .syntax unified
        .code   16
        .thumb_func
        .fpu softvfp
        .type   main, %function
main:
        @ args = 0, pretend = 0, frame = 8
        @ frame_needed = 1, uses_anonymous_args = 0
        push    {r7, lr}
        sub     sp, sp, #8
        add     r7, sp, #0
        movs    r3, #85
        str     r3, [r7, #4]
        ldr     r3, [r7, #4]
        movs    r2, #255
        lsls    r2, r2, #8
        eors    r3, r2
        str     r3, [r7]
        movs    r3, #0
        movs    r0, r3
        mov     sp, r7
        add     sp, sp, #8
        @ sp needed
        pop     {r7, pc}
        .size   main, .-main
        .ident  "GCC: (GNU Tools for ARM Embedded Processors 6-2017-q2-update) 6.3.1 20170620 (release) [ARM/embedded-6-branch revision 249437]"

How to decipher the code

        movs    r3, #85           @ r3 = 0x55 (variable a)
        str     r3, [r7, #4]      @ store a at [r7 + 4]
        ldr     r3, [r7, #4]      @ reload a into r3
        movs    r2, #255          @ r2 = 0xff
        lsls    r2, r2, #8        @ r2 = 0xff00
        eors    r3, r2            @ r3 = a ^ 0xff00 (variable b)
        str     r3, [r7]          @ store b at [r7]
        movs    r3, #0            @ r3 = 0
        movs    r0, r3            @ r0 = 0, the value for "return 0"
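As a quick cross-check of the arithmetic those instructions implement (movs and lsls build the constant 0xff00, then eors applies the XOR), here is the same computation in Python:

```python
# Reproduce the constant construction and XOR from the assembly.
a = 0x55
mask = 0xff << 8      # movs r2, #255 ; lsls r2, r2, #8
b = a ^ mask          # eors r3, r2
print(hex(b))         # 0xff55
```

Since 0x55 and 0xff00 have no overlapping bits, the XOR simply merges them into 0xff55, which is the value the str instruction writes to [r7].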
