SysML 2019: Stanford, CA, USA
- Ameet Talwalkar, Virginia Smith, Matei Zaharia: Proceedings of the Second Conference on Machine Learning and Systems, SysML 2019, Stanford, CA, USA, March 31 - April 2, 2019. mlsys.org 2019
- Zhihao Jia, Matei Zaharia, Alex Aiken: Beyond Data and Model Parallelism for Deep Neural Networks.
- Wonkyung Jung, Daejin Jung, Byeongho Kim, Sunjung Lee, Wonjong Rhee, Jung Ho Ahn: Restructuring Batch Normalization to Accelerate CNN Training.
- Zhihao Jia, James Thomas, Todd Warszawski, Mingyu Gao, Matei Zaharia, Alex Aiken: Optimizing DNN Computation with Relaxed Graph Substitutions.
- Assaf Eisenman, Maxim Naumov, Darryl Gardner, Misha Smelyanskiy, Sergey Pupyrev, Kim M. Hazelwood, Asaf Cidon, Sachin Katti: Bandana: Using Non-Volatile Memory for Storing Deep Learning Models.
- Hyeontaek Lim, David G. Andersen, Michael Kaminsky: 3LC: Lightweight and Effective Traffic Compression for Distributed Machine Learning.
- Michael Schaarschmidt, Sven Mika, Kai Fricke, Eiko Yoneki: RLgraph: Modular Computation Graphs for Deep Reinforcement Learning.
- Georgios Damaskinos, El-Mahdi El-Mhamdi, Rachid Guerraoui, Arsany Guirguis, Sébastien Rouault: AGGREGATHOR: Byzantine Machine Learning via Robust Gradient Aggregation.
- Paul N. Whatmough, Chuteng Zhou, Patrick Hansen, Shreyas K. Venkataramanaiah, Jae-sun Seo, Matthew Mattina: FixyNN: Energy-Efficient Real-Time Mobile Computer Vision Hardware Acceleration via Transfer Learning.
- Adam Lerer, Ledell Wu, Jiajun Shen, Timothée Lacroix, Luca Wehrstedt, Abhijit Bose, Alex Peysakhovich: Pytorch-BigGraph: A Large Scale Graph Embedding System.
- Anand Jayarajan, Jinliang Wei, Garth Gibson, Alexandra Fedorova, Gennady Pekhimenko: Priority-based Parameter Propagation for Distributed DNN Training.
- Qi Lei, Lingfei Wu, Pin-Yu Chen, Alex Dimakis, Inderjit S. Dhillon, Michael Witbrock: Discrete Adversarial Attacks and Submodular Optimization with Applications to Text Classification.
- Tian Zhao, Yaqi Zhang, Kunle Olukotun: Serving Recurrent Neural Networks Efficiently with a Spatial Accelerator.
- Akshay Agrawal, Akshay Naresh Modi, Alexandre Passos, Allen Lavoie, Ashish Agarwal, Asim Shankar, Igor Ganichev, Josh Levenberg, Mingsheng Hong, Rajat Monga, Shanqing Cai: TensorFlow Eager: A multi-stage, Python-embedded DSL for machine learning.
- Dibakar Gope, Ganesh Dasika, Matthew Mattina: Ternary Hybrid Neural-Tree Networks for Highly Constrained IoT Applications.
- Huizi Mao, Taeyoung Kong, Bill Dally: CaTDet: Cascaded Tracked Detector for Efficient Object Detection from Video.
- Jianyu Wang, Gauri Joshi: Adaptive Communication Strategies to Achieve the Best Error-Runtime Trade-off in Local-Update SGD.
- Yiren Zhao, Ilia Shumailov, Robert D. Mullins, Ross Anderson: To Compress Or Not To Compress: Understanding The Interactions Between Adversarial Attacks And Neural Network Compression.
- Minsik Cho, Ulrich Finkler, David S. Kung, Hillery C. Hunter: BlueConnect: Decomposing All-Reduce for Deep Learning on Heterogeneous Network Hierarchy.
- Maximilian Golub, Guy Lemieux, Mieszko Lis: Full Deep Neural Network Training On A Pruned Weight Budget.
- Sangkug Lym, Armand Behroozi, Wei Wen, Ge Li, Yongkee Kwon, Mattan Erez: Mini-batch Serialization: CNN Training with Inter-layer Data Reuse.
- Miguel Á. Carreira-Perpiñán, Mehdi Alizadeh: Parmac: Distributed Optimisation Of Nested Functions, With Application To Learning Binary Autoencoders.
- Jian Zhang, Ioannis Mitliagkas: YellowFin and the Art of Momentum Tuning.
- Daniel Smilkov, Nikhil Thorat, Yannick Assogba, Ann Yuan, Nick Kreeger, Ping Yu, Kangyi Zhang, Shanqing Cai, Eric Nielsen, David Soergel, Stan Bileschi, Michael Terry, Charles Nicholson, Sandeep N. Gupta, Sarah Sirajuddin, D. Sculley, Rajat Monga, Greg Corrado, Fernanda B. Viégas, Martin Wattenberg: TensorFlow.js: Machine Learning For The Web and Beyond.
- Cédric Renggli, Bojan Karlas, Bolin Ding, Feng Liu, Kevin Schawinski, Wentao Wu, Ce Zhang: Continuous Integration of Machine Learning Models with ease.ml/ci: Towards a Rigorous Yet Practical Treatment.
- Eric Breck, Neoklis Polyzotis, Sudip Roy, Steven Whang, Martin Zinkevich: Data Validation for Machine Learning.
- Jungwook Choi, Swagath Venkataramani, Vijayalakshmi Srinivasan, Kailash Gopalakrishnan, Zhuo Wang, Pierce Chuang: Accurate and Efficient 2-bit Quantized Neural Networks.
- Siyuan Ma, Mikhail Belkin: Kernel Machines That Adapt To Gpus For Effective Large Batch Training.
- Kallista A. Bonawitz, Hubert Eichner, Wolfgang Grieskamp, Dzmitry Huba, Alex Ingerman, Vladimir Ivanov, Chloé Kiddon, Jakub Konecný, Stefano Mazzocchi, Brendan McMahan, Timon Van Overveldt, David Petrou, Daniel Ramage, Jason Roselander: Towards Federated Learning at Scale: System Design.
- Dan Moldovan, James M. Decker, Fei Wang, Andrew A. Johnson, Brian K. Lee, Zachary Nado, D. Sculley, Tiark Rompf, Alexander B. Wiltschko: AutoGraph: Imperative-style Coding with Graph-based Performance.
- Christopher Canel, Thomas Kim, Giulio Zhou, Conglong Li, Hyeontaek Lim, David G. Andersen, Michael Kaminsky, Subramanya Dulloor: Scaling Video Analytics on Constrained Edge Nodes.
- Sayed Hadi Hashemi, Sangeetha Abdu Jyothi, Roy H. Campbell: TicTac: Accelerating Distributed Deep Learning with Communication Scheduling.
- Ting-Wu Chin, Ruizhou Ding, Diana Marculescu: AdaScale: Towards Real-time Video Object Detection using Adaptive Scaling.