
Archive for March 26th, 2014

Deep Learning COTS

This is the claim by Nvidia CEO Jen-Hsun Huang: that 3 Nvidia Titan Z CUDA-based GPU cards could deliver the same performance running deep learning neural nets as the Google Brain project achieved using Intel processors, at:

  • 1/150 the acquisition cost
  • 1/150 the power consumption

If this could be done in general for deep learning problems, we could afford many more machines to do machine learning on the explosion of data. At the same time, to use the CUDA cores, software programmers would need to learn to program this hardware and/or use OpenCL. The cost savings could warrant pushing over that learning curve.
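
To give a rough sense of what programming the CUDA cores involves, here is a minimal, hypothetical sketch (not from the post or the referenced paper) of a CUDA C kernel that applies a ReLU activation across a vector, the kind of elementwise operation that appears throughout deep neural nets. The sizes and names are illustrative assumptions only.

// Toy CUDA sketch: apply a ReLU activation to a vector on the GPU.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread handles one element of the input vector.
__global__ void relu(const float *in, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = in[i] > 0.0f ? in[i] : 0.0f;
    }
}

int main() {
    const int n = 1 << 20;              // ~1M elements
    size_t bytes = n * sizeof(float);

    // Host buffers with some sample data (even indices negative, odd positive).
    float *h_in  = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = (i % 2 ? 1.0f : -1.0f) * i;

    // Device buffers and host-to-device copy.
    float *d_in, *d_out;
    cudaMalloc(&d_in, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    relu<<<blocks, threads>>>(d_in, d_out, n);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check one value.
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[2] = %f\n", h_out[2]);  // negative input clamped to 0

    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}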

The paper referenced is "Deep Learning with COTS HPC Systems" by A. Coates, B. Huval, T. Wang, D. Wu, A. Ng, and B. Catanzaro, published at ICML 2013.
