Publications

“Good Robot!”: Efficient Reinforcement Learning for Multi-Step Visual Tasks with Sim to Real Transfer

Published in IEEE Robotics and Automation Letters, 2020

Current Reinforcement Learning (RL) algorithms struggle with long-horizon tasks where time can be wasted exploring dead ends and task progress may be easily reversed.

Recommended citation: A. Hundt, B. Killeen, H. Kwon, C. Paxton, and G. D. Hager. "Good Robot!": Efficient Reinforcement Learning for Multi-Step Visual Tasks with Sim to Real Transfer. IEEE Robotics and Automation Letters, vol. 5, no. 4, pp. 6724–6731, Oct. 2020. doi: 10.1109/LRA.2020.3015448. https://dx.doi.org/10.1109/LRA.2020.3015448

Efficient Processing of Convolutional Neural Network Layers Using Analog-memory-based Hardware

Published in USPTO, 2020

(Patent.) According to one or more embodiments, a computer-implemented method for implementing a convolutional neural network (CNN) using a crosspoint array…

Recommended citation: G. W. Burr and B. D. Killeen. 2020. Efficient Processing of Convolutional Neural Network Layers Using Analog-memory-based Hardware. U.S. Patent Application 20200117986, filed March 25, 2019, and published April 16, 2020. https://uspto.report/patent/app/20200117986

A County-level Dataset for Informing the United States Response to COVID-19

Published in arXiv, 2020

As the coronavirus disease 2019 (COVID-19) continues to be a global pandemic, policy makers have enacted and reversed non-pharmaceutical interventions with various levels of restrictions to limit its spread.

Recommended citation: B. D. Killeen*, J. Y. Wu*, K. Shah, A. Zapaishchykova, P. Nikutta, A. Tamhane, S. Chakraborty, J. Wei, T. Gao, M. Thies, M. Unberath. A County-level Dataset for Informing the United States Response to COVID-19. arXiv preprint, 2020, arXiv:2004.00756. https://arxiv.org/abs/2004.00756

Equivalent-accuracy Accelerated Neural-network Training Using Analogue Memory

Published in Nature, 2018

Neural-network training can be slow and energy intensive, owing to the need to transfer the weight data for the network between conventional digital memory chips and processor chips. Analogue non-volatile memory can accelerate the neural-network training algorithm known as backpropagation by performing parallelized multiply–accumulate operations in the analogue domain at the location of the weight data.

Recommended citation: Ambrogio, S., Narayanan, P., Tsai, H. et al. Equivalent-accuracy accelerated neural-network training using analogue memory. Nature 558, 60–67 (2018). https://doi.org/10.1038/s41586-018-0180-5