Machine learning for Big Data, the Internet of Things, swarm robotics, and 3D cameras.
I am especially excited about closing the gap between theoretical and practical algorithms, drawing on my experience in both industry and academia.
Core-sets: semantic compression of a data set into a small set that provably approximates the original data for a given problem. Using merge-and-reduce (e.g., in Spark), these small sets can then be used to solve hard machine learning problems in parallel (on the cloud or a network), in real time, and on streaming Big Data.
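The merge-and-reduce idea above can be illustrated with a toy sketch (not one of the research constructions, which typically give (1+ε)-approximations for much harder problems via careful sampling): for the 1-mean cost, a summary of constant size answers queries exactly, and two summaries merge in O(1), so chunks of a stream can be reduced independently and combined.

```python
# Illustrative sketch only: an exact, mergeable summary ("coreset")
# for the 1-mean cost sum_i (x_i - c)^2 over 1-D points.
# The class name and fields are hypothetical, chosen for this example.
from dataclasses import dataclass
from typing import List


@dataclass
class MeanCoreset:
    n: float      # number of points summarized
    mean: float   # their mean
    inner: float  # sum of squared distances to that mean

    @staticmethod
    def of(points: List[float]) -> "MeanCoreset":
        n = len(points)
        mu = sum(points) / n
        return MeanCoreset(n, mu, sum((x - mu) ** 2 for x in points))

    def merge(self, other: "MeanCoreset") -> "MeanCoreset":
        # Combine two summaries without revisiting the raw points,
        # as in the merge-and-reduce tree over stream chunks.
        n = self.n + other.n
        mu = (self.n * self.mean + other.n * other.mean) / n
        inner = (self.inner + other.inner
                 + self.n * (self.mean - mu) ** 2
                 + other.n * (other.mean - mu) ** 2)
        return MeanCoreset(n, mu, inner)

    def cost(self, c: float) -> float:
        # sum_i (x_i - c)^2 = n*(mean - c)^2 + inner, for any query c
        return self.n * (self.mean - c) ** 2 + self.inner
```

Each chunk of a stream is reduced to a `MeanCoreset`, and merging the summaries reproduces the exact cost of the full data for every query center; real coresets apply the same pattern to problems (k-means, regression, robotics) where only an approximation of this quality is possible.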
Videos from the Machine Learning Summer School 2014 at CMU
Dan Feldman is a faculty member and the head of the new Robotics & Big Data Labs at the University of Haifa, after returning from a three-year post-doc at the robotics lab of MIT. During his PhD at Tel Aviv University he developed data reduction techniques known as core-sets, based on computational geometry. Since his post-docs at Caltech and MIT, his core-sets have been applied to central problems in machine learning, Big Data, computer vision, EEG, and robotics. His group in Haifa continues to design and implement core-sets with provable guarantees for such real-time systems.