LOD 2021 Best Paper Award
“An Integrated Approach to Produce Robust Deep Neural Network Models with High Efficiency”
Zhijian Li, University of California, Irvine, USA
Bao Wang, The University of Utah, USA
Jack Xin, University of California, Irvine, USA
Springer sponsors the LOD 2021 Best Paper Award with a cash prize of 1,000 Euro.
Special Mention:
“Statistical Estimation of Quantization for Probability Distributions: Best Equivariant Estimator of Principal Points”
Shun Matsuura, Keio University, Japan
Hiroshi Kurata, The University of Tokyo, Japan
“Neural Weighted A*: Learning Graph Costs and Heuristics with Differentiable Anytime A*”
Alberto Archetti, Marco Cannici and Matteo Matteucci
Politecnico di Milano, Italy
LOD 2021 Best Talk
“Go to YouTube and Call Me in the Morning: Use of Social Media for Chronic Conditions”
Rema Padman 1, Xiao Liu 2, Anjana Susarla 3 and Bin Zhang 4
1 Carnegie Mellon University, USA
2 Arizona State University, USA
3 Michigan State University, USA
4 University of Arizona, USA
Past Awards
- LOD 2020:
- LOD 2020 Best Paper:
“Quantifying Local Energy Demand through Pollution Analysis”
Cole Smith 1, Andrii Dobroshynskyi 1, and Suzanne McIntosh 1,2
1 Courant Institute of Mathematical Sciences, New York University, USA
2 Center for Data Science, New York University, USA
- LOD 2020 Special Mention:
- “Sparsity Meets Robustness: Channel Pruning for the Feynman-Kac Formalism Principled Robust Deep Neural Nets”
Thu Dinh, Bao Wang, Andrea Bertozzi, Stanley Osher and Jack Xin,
University of California, Irvine – University of California, Los Angeles (UCLA), USA
- “State Representation Learning from Demonstration”
Astrid Merckling, Alexandre Coninx, Loic Cressot, Stéphane Doncieux and Nicolas Perrin,
Sorbonne Université, Paris, France
- “Sparse Perturbations for Improved Convergence in SZO Optimization”
Mayumi Ohta, Nathaniel Berger, Artem Sokolov and Stefan Riezler, Heidelberg University, Germany
- LOD 2020 Best Talks:
- “A fast and efficient smoothing approach to LASSO regression and an application in statistical genetics: polygenic risk scores for Chronic obstructive pulmonary disease (COPD)”
Georg Hahn, Sharon Marie Lutz, Nilanjana Laha and Christoph Lange,
Department of Biostatistics, T.H. Chan School of Public Health, Harvard University, USA
- “Gravitational Forecast Reconciliation”
Carla Freitas Silveira, Mohsen Bahrami, Vinicius Brei, Burcin Bozkaya, Selim Balcisoy and Alex “Sandy” Pentland,
University of Bologna, Italy – MIT Media Laboratory, USA – Federal University of Rio Grande do Sul, Brazil and MIT Media Laboratory, USA – New College of Florida, USA and Sabanci University, Turkey – Sabanci University, Turkey – Massachusetts Institute of Technology, MIT Media Laboratory, USA
- “From Business Curated Products to Algorithmically Generated”
Vera Kalinichenko and Garima Garg, University of California, Los Angeles – UCLA, USA and FabFitFun, USA
- LOD 2019:
“Deep Neural Network Ensembles”
Sean Tao
Carnegie Mellon University, USA
- LOD 2018:
“Calibrating the Classifier: Siamese Neural Network Architecture for End-to-End Arousal Recognition from ECG”
Andrea Patanè* and Marta Kwiatkowska*
*Department of Computer Science, University of Oxford, UK
- MOD 2017:
“Recipes for Translating Big Data Machine Reading to Executable Cellular Signaling Models”
Khaled Sayed*, Cheryl Telmer**, Adam Butchy* & Natasa Miskov-Zivanov**
*University of Pittsburgh, USA **Carnegie Mellon University, USA
- MOD 2016:
“Machine Learning: Multi-site Evidence-based Best Practice Discovery”
Eva Lee*, Yuanbo Wang and Matthew Hagen
*Professor and Director, Center for Operations Research in Medicine and HealthCare, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, GA, USA
- MOD 2015:
“Learning with Discrete Least Squares on Multivariate Polynomial Spaces using Evaluations at Random or Low-Discrepancy Point Sets”
Giovanni Migliorati
École Polytechnique Fédérale de Lausanne – EPFL, Lausanne, Switzerland