Ivan Bruha
Department of Computing and Software, McMaster University
Hamilton, Ont., Canada L8S4K1

Office: ITB-219
Phone: +1.905.5259140 ext. 23439
Fax: +1.905.5240340
Email: bruha@mcmaster.ca


Education

Dipl. Ing., 1969, Czech Technical University, Faculty of Electrical Engineering, Prague (Cybernetics)
PhD, 1973, Czech Technical University, Faculty of Electrical Engineering, Prague (Artificial Intelligence)
RNDr, 1974, Charles University, Faculty of Mathematics and Physics, Prague (Logic & Formal Languages)

Areas of Interest

Machine Learning, Neural Nets, Pattern Recognition; Programming Languages for AI

o Attribute-based learning from examples. I have implemented a completely new, extended version of the CN2 covering learning algorithm and am designing and running experiments that compare it with other well-known ML algorithms, particularly on:
(i) Economic learning concerning attribute cost;
(ii) Unknown attribute value processing;
(iii) Rule quality (Unordered-mode classification requires a numerical evaluation to be attached to each rule. I design various formulas for rule quality and investigate how these qualities can be combined in order to resolve the conflicts that arise when classifying a given unseen object; see the first sketch after this list.)
o Incremental learning. I am convinced that the incrementality of learning should be studied in more detail. I am investigating new approaches to this topic, particularly the ways in which a learning system can forget and eventually retrieve some 'pieces' of knowledge (see the second sketch after this list). This incremental way of learning (which is distinctive of human learning) requires embedding powerful statistical measures and techniques into the traditional symbolic learning systems.
o Hybrid representation and systems. There have been attempts to combine various knowledge representation tools as well as machine learning paradigms. Such multistrategy or hybrid systems may help to cope with the complexity of learning. I am particularly interested in combining symbolic learning with neural nets: each paradigm has its advantages and drawbacks, and only a sophisticated merging of the two is likely to enhance the believability of such learning systems.
o Neural Net Applications. Besides the above hybrid systems, neural nets appear to be very useful for emulating probability distribution functions and belief functions in statistical applications.
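
To make the rule-quality topic (iii) above concrete, the following minimal Python sketch shows unordered-mode classification in which every rule carries a numerical quality and the conflict among all rules firing on an unseen example is resolved by combining their qualities per class. The rule representation, the per-class summation, and the fallback default class are illustrative assumptions only, not the actual formulas under investigation.

    from dataclasses import dataclass

    @dataclass
    class Rule:
        conditions: dict        # attribute name -> required value (illustrative representation)
        predicted_class: str
        quality: float          # numerical quality attached to the rule for unordered-mode classification

        def covers(self, example: dict) -> bool:
            return all(example.get(attr) == value for attr, value in self.conditions.items())

    def classify(example: dict, rules: list) -> str:
        # Conflict resolution: combine the qualities of all firing rules per class
        # (a simple sum here; the combination scheme itself is the research question).
        votes = {}
        for rule in rules:
            if rule.covers(example):
                votes[rule.predicted_class] = votes.get(rule.predicted_class, 0.0) + rule.quality
        if not votes:
            return "default_class"   # no rule fires; fall back to a default class
        return max(votes, key=votes.get)

    # Two conflicting rules fire; the class with the larger combined quality wins.
    rules = [Rule({"outlook": "sunny"}, "play", 0.6),
             Rule({"humidity": "high"}, "dont_play", 0.8)]
    print(classify({"outlook": "sunny", "humidity": "high"}, rules))   # prints: dont_play

Replacing the per-class summation with another combination function is precisely the design decision that the rule-quality research addresses.

The notion of forgetting in the incremental-learning item can be illustrated in the same spirit: the sketch below keeps exponentially decayed coverage statistics for a rule, so evidence from old examples gradually fades, and a rule whose decayed support drops below a threshold becomes a candidate for being forgotten (and possibly retrieved later if its support returns). The decay factor, the Laplace-style quality estimate, and the threshold are assumptions chosen for illustration only.

    class DecayedRuleStats:
        # Exponentially decayed coverage counts for one rule: a crude stand-in for
        # 'forgetting', since evidence from old examples gradually fades away.
        def __init__(self, decay=0.99):
            self.decay = decay
            self.covered = 0.0    # decayed number of examples the rule covered
            self.correct = 0.0    # decayed number it also classified correctly

        def update(self, covers, correct):
            # Decay the accumulated evidence, then add the new example's contribution.
            self.covered = self.decay * self.covered + (1.0 if covers else 0.0)
            self.correct = self.decay * self.correct + (1.0 if covers and correct else 0.0)

        def quality(self):
            # Laplace-corrected accuracy estimate over the decayed counts.
            return (self.correct + 1.0) / (self.covered + 2.0)

        def should_forget(self, min_support=1.0):
            # A rule whose decayed support fades below a threshold is a candidate for
            # being set aside ('forgotten') and possibly retrieved if support returns.
            return self.covered < min_support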
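A full incremental learner would of course combine such per-rule statistics with the symbolic revision of the rules themselves; the class above only illustrates the statistical bookkeeping that forgetting and retrieval would rely on.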

Lifetime Publications

Workshop on Postprocessing in Machine Learning and Data Mining, ACM International Conference on Knowledge Discovery in Databases (KDD-2000), Boston, 2000: all publications