MATLAB CUDA support provides the base for GPU-accelerated MATLAB operations and lets you integrate existing CUDA kernels into MATLAB applications. One restriction, however, is that MATLAB only supports GPUs with CUDA compute capability 1.3 or higher, such as the Tesla 10-series and 20-series GPUs. This limitation was not a casual choice; it stems from the double-precision support and IEEE-compliant math implementation introduced with compute capability 1.3. Please see this thread for more discussion.
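As a minimal sketch of how this fits together (assuming the Parallel Computing Toolbox and a CUDA-capable GPU are available), you can query the device's compute capability and run overloaded operations on GPU-resident data:

% Inspect the current GPU and confirm its compute capability (must be 1.3+).
dev = gpuDevice;
fprintf('GPU: %s, compute capability %s\n', dev.Name, dev.ComputeCapability);

% Move data to the GPU, compute there with overloaded functions, then gather.
A = gpuArray(rand(1000));   % copy a 1000x1000 matrix to device memory
B = fft(A) .* 2;            % fft and element-wise operations execute on the GPU
C = gather(B);              % copy the result back to host memory

The overloading approach means existing code often needs only the gpuArray/gather conversions at its boundaries.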
MATLAB GPU computing capabilities include:
- Data manipulation on NVIDIA GPUs
- GPU-accelerated MATLAB operations
- Integration of CUDA kernels into MATLAB applications without low-level C or Fortran programming (see the sketch after this list)
- Use of multiple GPUs on the desktop (via the toolbox) and on a computer cluster (via MATLAB Distributed Computing Server)
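To illustrate the kernel-integration point above, here is a rough sketch of invoking an existing CUDA kernel from MATLAB via parallel.gpu.CUDAKernel; the file names and kernel signature (add_vectors.ptx / add_vectors.cu, taking double pointers c, a, b and an int length) are hypothetical placeholders:

% Wrap a precompiled kernel; MATLAB reads the prototype from the .cu file.
k = parallel.gpu.CUDAKernel('add_vectors.ptx', 'add_vectors.cu');
k.ThreadBlockSize = 256;            % threads per block
k.GridSize = ceil(4096 / 256);      % enough blocks to cover 4096 elements

a = gpuArray(rand(4096, 1));        % inputs already resident on the GPU
b = gpuArray(rand(4096, 1));
c = gpuArray(zeros(4096, 1));       % output buffer
c = feval(k, c, a, b, 4096);        % launch; the output comes back as a gpuArray
result = gather(c);                 % bring the result back to the host

The appeal is that the CUDA kernel itself is reused unchanged; only the launch configuration and data transfers are expressed in MATLAB.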
Introduction to MATLAB GPU Computing (Video)
MATLAB GPU Computing (Documentation)
At AccelerEyes, we are happy that MathWorks has finally turned the corner on accepting GPU technology, something we've been pushing them to do since 2007.
We are also excited, because this is great news for the GPU computing ecosystem, to see one of the technical computing giants validate the approach we've been delivering with Jacket.
Finally, we are thrilled, because we are confident that anyone attempting to use GPU computing via MATLAB directly will end up becoming a Jacket programmer (for instance, try indexing into a matrix or running a convolution with R2010b). To see how Jacket stacks up against the alternative, see:
* http://www.accelereyes.com/products/compare
and
* http://forums.accelereyes.com/forums/viewtopic.php?f=7&t=1487
Jacket will always offer the best in GPU computing technology. Without Jacket's runtime technology, no MATLAB GPU computing attempt will be able to come close to competing in performance benefits on real applications. It is one thing to get a few functions to run on the GPU. But getting full applications to run fast on the GPU is an entirely different matter.
We invite you to check it out for yourself by visiting http://www.accelereyes.com.
Best,
John Melonakos
CEO
AccelerEyes
Hi John Melonakos,
Finally, I am glad that competition is starting in GPU computing for MATLAB, and customers will surely benefit from it.
GPU computing should be free and will remain free. Still, it is at a crawling stage and cannot yet support complex kernels running on the GPU.
Let's wait and see what happens and who can get these complex kernels running on the GPU.
Best Regards...
Hey,
I'm another engineer at AccelerEyes and happened upon your blog. Cool article on the CUDA HeatTransfer port.
Speaking of GPU computing for free, we just released ArrayFire: a free numerical library for CUDA with interfaces to C/C++, Python, and Fortran. It's easier to use than Thrust, and contains hundreds of algorithms and functions. It might be of use in some of your projects down the road.
Download it free: http://arrayfire.com
Best of luck upon graduation!
James Malcolm, AccelerEyes