Short answer: Python and R for academic work; Java or C++ for production. Julia has some followers, and Lua has a deep-learning implementation (Torch). Python + CUDA + C++ is a very common combination in deep learning.
---
I would say the top languages are Python and R. Many enterprises use SAS and/or SPSS. Java is often used in production settings or when dealing with Hadoop. Other less popular but still useful languages are: Julia, Octave/Matlab, and Perl.
This is a bit off topic, but I thought I'd mention that there's a new language called Premise being developed specifically for AI / machine learning by a couple of guys in LA. I saw the main developer demo it at a meetup and it looked very promising.
---
Chiming in as a machine learning researcher. My experience is primarily focused on the 'deep learning' buzzword at the moment, so people still researching SVM/NN/clustering may have different experiences.
Academic machine learning is starting to shift towards Python. Most (all?) of the main deep learning packages either have Python at the core (like PyLearn2 + Theano) or a strong Python component (like Caffe or DeepLearn). There is some Java presence (like DeepLearning4j) and some C++ mixed in for the very high-performance code. Usually people write the glue code in Python and then do the heavy matrix-intensive operations through PyCUDA or cuBLAS (which, to oversimplify, let Python call into compiled CUDA/OpenCL code).
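To make the "Python glue + fast native math" pattern concrete, here's a minimal sketch. I'm using NumPy in place of PyCUDA/cuBLAS so it runs without a GPU, but the division of labor is the same: the Python code is just orchestration, and `np.dot` dispatches the expensive matrix multiply to compiled BLAS routines (the layer shapes and names below are made up for illustration).

```python
import numpy as np

def forward_layer(x, weights, bias):
    """One dense layer: the control flow lives in Python, the FLOPs don't."""
    z = np.dot(x, weights) + bias   # heavy lifting done in native BLAS code
    return np.maximum(z, 0)         # ReLU, also vectorized in C

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))     # batch of 4 inputs, 8 features each
w = rng.standard_normal((8, 3))     # weights for a hypothetical 3-unit layer
b = np.zeros(3)

out = forward_layer(x, w, b)
print(out.shape)                    # (4, 3)
```

With PyCUDA you'd allocate the arrays on the GPU and launch kernels instead, but the Python side stays this thin.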
Some of the bigger names in Deep Learning:
(C++ w/ Python & Matlab support) Caffe: https://github.com/BVLC/caffe Their talk has links to a lot of other implementations.
(Python) PyLearn2: https://github.com/lisa-lab/pylearn2 Another popular deep-learning project.
(Java) DeepLearning4j: https://github.com/agibsonccc/java-deeplearning
(C++ + Python) Cuda-Convnet: https://code.google.com/p/cuda-convnet/ Has C++ for the learning portions and Python glue.