NEC orchestrating a brighter world
NEC Laboratories Europe


[Figure: sample video traffic camera images]

Learning to Transfer with von Neumann Conditional Divergence

Learning paradigms designed for multiple domains or tasks, such as multitask learning, continual learning, and domain adaptation, aim to reduce the large amount of energy and manual labor needed to retrain machine learning models. In this work, we introduce a domain adaptation approach that exploits features learned on relevant source tasks to reduce the amount of data required for learning a new target task.

[Figure: GCN out-of-distribution uncertainty on CiteSeer]

Uncertainty Quantification in Node Classification

Modern neural networks are widely applied across learning tasks because of their exceptional performance, yet they fail to express uncertainty about their predictions. For example, if a neural network trained to predict whether an image contains a cat or a dog is given an elephant as input, it will not admit that it is unsure; instead, the model will typically still choose cat or dog with relatively high probability. In high-risk domains such as healthcare and autonomous driving, this behavior is problematic: the cost and damage caused by overconfident or underconfident predictions can be catastrophic.
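A minimal sketch of why this happens: a softmax classifier always distributes probability mass over its known classes, so even an out-of-distribution input is forced into a confident-looking prediction. The logit values below are purely illustrative, not taken from any actual model.

```python
import numpy as np

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits a cat-vs-dog classifier might produce for an
# out-of-distribution input (say, an elephant photo). The network
# has no "none of the above" option; it must commit to a known class.
logits = np.array([2.3, 0.4])  # [cat, dog] -- illustrative values only
probs = softmax(logits)

# The output always sums to 1, and here the "cat" probability is high
# even though the input belongs to neither class.
print(probs)
```

Uncertainty quantification methods aim to detect exactly this situation, flagging inputs on which the model's confident-looking output should not be trusted.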
