Officially worked too much / deep learning


#1

I’ve been working every second possible and have barely taken a break for months unless family or obligations intervened. I finally crashed today after working extra hard the past few days to finish two projects so I can start on two big projects I have due mid-August. My body is not happy with me. I do take 10 minutes off work each day to ride a bike, and I do a few push-ups here and there. I think I’ve gotten enough reviews, so I raised my prices again.

I’m going to watch Blade Runner, since it was on in the background while I was working on an article about deep learning yesterday, and it captured my imagination. The so-called deep learning revolution came about because researchers found that by using a GPU to run (AI-inspired) machine learning (ML) algorithms, they could complete experiments in days instead of weeks. If you enjoy the improvements in voice-activated digital assistants, Google search, image recognition, etc., you can thank NVIDIA for helping out with the introduction of the GPU. It’s only been since 2012 that this technique became popular and began to change the world around us at a rapid pace.

Today, cryptocurrency mining has inspired many more improvements in processing hardware. The ASIC (application-specific integrated circuit) is now commonly used to mine Bitcoin and can be up to 30x faster than a GPU. These ASIC chips are getting faster and faster. Their biggest manufacturer, Bitmain, has already produced one ASIC that is compatible with ML algorithms, and is now expanding its business to target the AI research field.

We haven’t even caught up with what the GPU made possible. What’s going to happen when there are massive ASIC machine learning farms available in the cloud for any researcher or hobbyist to experiment with?

OK, I’m going to go watch Blade Runner 2049 and give my arms a rest.