See also my Google Scholar profile.

*Denotes first author or co-first author paper.

  1. S. Topan, D. Rolnick, X. Si, Techniques for Symbol Grounding with SATNet, preprint arXiv:2106.11072, 2021.
  2. L.A. Reisch, L. Joppa, P. Howson, A. Gil, P. Alevizou, N. Michaelidou, R. Appiah-Campbell, T. Santarius, S. Köhler, M. Pizzol, P.-J. Schweizer, D. Srinivasan, L.H. Kaack, P.L. Donti, D. Rolnick, Digitizing a sustainable future, One Earth 4(6):768-771, 2021.
  3. *B. Hanin, R. Jeong, D. Rolnick, Deep ReLU Networks Preserve Expected Length, preprint arXiv:2102.10492, 2021.
  4. *P.L. Donti, D. Rolnick, J.Z. Kolter, DC3: A learning method for optimization with hard constraints, International Conference on Learning Representations (ICLR) 2021.
  5. M. Skreta, S. Luccioni, D. Rolnick, Spatiotemporal Features Improve Fine-Grained Butterfly Image Classification, NeurIPS Workshop on Tackling Climate Change with Machine Learning, 2020.
  6. L.H. Kaack, P.L. Donti, E. Strubell, D. Rolnick, Artificial Intelligence and climate change: Opportunities, considerations, and policy levers to align AI with climate change goals, Heinrich Böll Foundation E-Paper, 2020.
  7. *D. Rolnick, K.P. Körding, Reverse-engineering deep ReLU networks, International Conference on Machine Learning (ICML) 2020.
  8. *D. Rolnick, A. Ahuja, J. Schwarz, T.P. Lillicrap, G. Wayne, Experience replay for continual learning, Conference on Neural Information Processing Systems (NeurIPS) 2019.
  9. *B. Hanin, D. Rolnick, Deep ReLU networks have surprisingly few activation patterns, Conference on Neural Information Processing Systems (NeurIPS) 2019.
  10. *D. Rolnick, P.L. Donti, L.H. Kaack, K. Kochanski, A. Lacoste, K. Sankaran, A.S. Ross, N. Milojevic-Dupont, N. Jaques, A. Waldman-Brown, A. Luccioni, T. Maharaj, E.D. Sherwin, S.K. Mukkavilli, K.P. Kording, C. Gomes, A.Y. Ng, D. Hassabis, J.C. Platt, F. Creutzig, J. Chayes, Y. Bengio, Tackling Climate Change with Machine Learning, preprint arXiv:1906.05433, 2019.
  11. *B. Hanin, D. Rolnick, Complexity of linear regions in deep networks, International Conference on Machine Learning (ICML) 2019.
  12. *D. Rolnick, J. Pouget-Abadie, K. Aydin, S. Kamali, V. Mirrokni, A. Najmi, Randomized experimental design via geographic clustering, Conference on Knowledge Discovery and Data Mining (KDD) 2019.
  13. A. Benjamin, D. Rolnick, K. Kording, Measuring and regularizing networks in function space, International Conference on Learning Representations (ICLR) 2019.
  14. Y. Meirovitch, L. Mi, H. Saribekyan, A. Matveev, D. Rolnick, C. Wierzynski, N. Shavit, Cross-classification clustering: An efficient multi-object tracking technique for 3-D instance segmentation in connectomics, Conference on Computer Vision and Pattern Recognition (CVPR) 2019.
  15. *D. Rolnick, E. Dyer, Generative models and abstractions for large-scale neuroanatomy datasets, Current Opinion in Neurobiology, 2019.
  16. *G. Spencer, D. Rolnick, On the robust hardness of Gröbner basis computation, Journal of Pure and Applied Algebra 223(5):2080-2100, 2019.
  17. *B. Hanin, D. Rolnick, How to start training: The effect of initialization and architecture, Conference on Neural Information Processing Systems (NeurIPS) 2018.
  18. *D. Rolnick, M. Tegmark, The power of deeper networks for expressing natural functions, International Conference on Learning Representations (ICLR) 2018.
  19. *R. Farhoodi, D. Rolnick, K. Kording, Neuron dendrograms uncover asymmetrical motifs, Computational and Systems Neuroscience (Cosyne) 2018.
  20. *D. Rolnick, A. Veit, S. Belongie, N. Shavit, Deep learning is robust to massive label noise, preprint arXiv:1705.10694, 2017.
  21. *D. Rolnick, Y. Meirovitch, T. Parag, H. Pfister, V. Jain, J.W. Lichtman, E.S. Boyden, N. Shavit, Morphological error detection in 3D segmentations, Computational and Systems Neuroscience (Cosyne) 2018.
  22. H. Lin, M. Tegmark, D. Rolnick, Why does deep and cheap learning work so well?, Journal of Statistical Physics 168(6):1223-1247, 2017.
  23. *D. Rolnick, J. Bernstein, I. Dasgupta, H. Sompolinsky, Markov transitions between attractor states in a recurrent neural network, Computational and Systems Neuroscience (Cosyne) 2017.
  24. *D. Rolnick, P. Soberón, Quantitative (p,q)-theorems in combinatorial geometry, Discrete Mathematics 340(10):2516-2527, 2017.
  25. *J.A. De Loera, R.N. La Haye, D. Rolnick, P. Soberón, Quantitative Tverberg theorems over lattices and other discrete sets, Discrete & Computational Geometry 58(2):435-448, 2017.
  26. *J.A. De Loera, R.N. La Haye, D. Rolnick, P. Soberón, Quantitative combinatorial geometry for continuous parameters, Discrete & Computational Geometry 57(2):318-334, 2017.
  27. *D. Rolnick, On the classification of Stanley sequences, European Journal of Combinatorics 59:51-70, 2017.
  28. Y. Meirovitch, A. Matveev, H. Saribekyan, D. Budden, D. Rolnick, G. Odor, S. Knowles-Barley, T. Jones, H. Pfister, J.W. Lichtman, N. Shavit, A multi-pass approach to large-scale connectomics, preprint arXiv:1612.02120, 2016.
  29. *D. Rolnick, P. Soberón, Algorithmic aspects of Tverberg’s theorem, preprint arXiv:1601.03083, 2016.
  30. *R.A. Moy, D. Rolnick, Novel structures in Stanley sequences, Discrete Mathematics 339(2):689-698, 2016.
  31. *N. Golowich, D. Rolnick, Acyclic subgraphs of planar digraphs, Electronic Journal of Combinatorics 22(3):P3.7, 2015.
  32. *J.A. De Loera, S. Margulies, M. Pernpeintner, E. Riedl, D. Rolnick, G. Spencer, D. Stasi, J. Swenson, Graph-coloring ideals: Nullstellensatz certificates, Gröbner bases for chordal graphs, and hardness of Gröbner bases, International Symposium on Symbolic and Algebraic Computation (ISSAC) 2015.
  33. *D. Rolnick, P. Venkataramana, On the growth of Stanley sequences, Discrete Mathematics 338(11):1928-1937, 2015.
  34. *D. Rolnick, The on-line degree Ramsey number of cycles, Discrete Mathematics 313(2):2084-2093, 2013.
  35. *D. Rolnick, Trees with an on-line degree Ramsey number of four, Electronic Journal of Combinatorics 18(1):P173, 2011.