Speaker
Dr. Feng is currently the Elizabeth & James Turner Fellow and Professor of Computer Science, Electrical & Computer Engineering, Health Sciences, and Biomedical Engineering and Mechanics at Virginia Tech (VT), where he directs the Synergy Lab and serves as a VT site co-director for the National Science Foundation Center for High-Performance Reconfigurable Computing (CHREC). Prior to VT, Dr. Feng held positions at Los Alamos National Laboratory, Ohio State University, Purdue University, the University of Illinois at Urbana-Champaign, EnergyWare, Orion Multisystems, Vosaic, NASA Ames Research Center, and the IBM T.J. Watson Research Center. He has published 250+ peer-reviewed technical publications spanning high-performance networking and computing, high-speed systems monitoring and measurement, low-power and power-aware computing, computer science pedagogy for K-12, and bioinformatics. Of recent note, his biocomputing research on the Microsoft Cloud was featured in worldwide commercials (https://www.youtube.com/watch?v=GY2Bg0op-Kc) in 2015 and 2016 and, in 2018, in a Microsoft AI commercial.
Dr. Feng holds a Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign, an M.S. in Computer Engineering from Penn State University, and B.S. degrees in Electrical & Computer Engineering and Music, also from Penn State. In addition to being a Distinguished Scientist of the ACM and a Senior Member of the IEEE Computer Society, Dr. Feng has twice been named to HPCwire's Top People to Watch list, in 2004 and 2011, and was recognized with an Outstanding Faculty Award bestowed by the Commonwealth of Virginia in 2014.
Abstract
Project MOON, short for MapReduce On Opportunistic eNvironments, proactively harvests the unused compute cycles of volatile computing resources (such as desktop computers in a company or transient compute resources in the field) and combines them with a small number of dedicated computing resources to provide the illusion of a robust supercomputer. MOON computing differs from cloud computing in three ways: (1) the programming model adopted for the MOON environment is MapReduce, the same technology that Google built to power its search engine; (2) MOON computing can be effectively realized on largely unreliable compute resources; and (3) MOON makes use of the idle or unused compute cycles within one's own institution or enterprise, i.e., "in-sourcing" computation rather than "out-sourcing" it to cloud providers such as Amazon or Rackspace. The latter point substantially enhances the return on investment in institutional computing resources, whether they be in the classroom or on the battlefield.
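To make the programming model concrete, the sketch below illustrates the MapReduce pattern that MOON adopts, using word count as the canonical example. This is a minimal, single-process Python illustration of the model only; the function names and in-memory shuffle are assumptions for exposition, not MOON's actual API, which distributes these phases across volatile and dedicated nodes.

```python
from collections import defaultdict

# Minimal in-memory sketch of the MapReduce model that MOON adopts.
# Function names and the single-process "shuffle" are illustrative
# assumptions; a real deployment distributes these phases across
# volatile and dedicated nodes.

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in the input."""
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group intermediate values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: sum the counts for one word."""
    return (key, sum(values))

if __name__ == "__main__":
    documents = ["the quick brown fox", "the lazy dog", "the fox"]
    # Map each document independently -- the step that can be
    # scattered across idle, opportunistic machines.
    intermediate = [pair for doc in documents for pair in map_phase(doc)]
    # Group and reduce the intermediate results.
    counts = [reduce_phase(k, v) for k, v in shuffle(intermediate).items()]
    print(sorted(counts))
```

The map calls are independent of one another, which is precisely what lets MOON scatter them across opportunistic resources; per the abstract, the small pool of dedicated nodes then anchors the results so that a volatile worker departing mid-job does not lose the computation.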