2.1 On the Complexity of Neural-Network-Learned Functions

Monday, 23 January 2017: 4:00 PM
310 (Washington State Convention Center)
Caren Marzban, University of Washington, Seattle, WA; and R. Viswanathan

Handout (1.1 MB)

There exists a large body of knowledge on assessing the complexity of functions learnable by Neural Networks (NNs). These works are generally highly theoretical and, consequently, not directly useful to most NN practitioners. In this talk, we will consider a completely binary problem (i.e., with binary input nodes, a binary output node, binary hidden nodes, and binary weights). For such problems, the number of possible (Boolean) functions relating the inputs to the output is finite and countable. Here, we will argue that the number of weight configurations representing each function is a measure of that function's complexity. This simple set-up not only helps one understand some of the aforementioned theoretical results, but also allows one to ask, and answer, practical questions like "How complex is the function that my NN has learned from data?"
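The counting idea described above can be illustrated with a small sketch. The network size, weight alphabet, and activation below are assumptions for illustration only (a 2-input, 2-hidden-node, 1-output network with weights in {-1, +1} and a step activation), not the speakers' actual setup: every weight configuration is enumerated, the Boolean truth table it realizes is recorded, and the configurations per function are tallied.

```python
from itertools import product
from collections import Counter

def step(x):
    # Binary threshold activation: 1 if the weighted sum is non-negative.
    return 1 if x >= 0 else 0

def forward(w_hidden, w_out, inputs):
    # w_hidden: two rows of two weights (one row per hidden node);
    # w_out: two weights from the hidden layer to the output node.
    h = [step(sum(wi * xi for wi, xi in zip(row, inputs))) for row in w_hidden]
    return step(sum(wi * hi for wi, hi in zip(w_out, h)))

counts = Counter()
# 6 binary weights in total: 4 hidden + 2 output, each in {-1, +1}.
for flat in product([-1, 1], repeat=6):
    w_hidden = [flat[0:2], flat[2:4]]
    w_out = flat[4:6]
    # The truth table over all 4 binary input pairs identifies the
    # Boolean function this weight configuration realizes.
    table = tuple(forward(w_hidden, w_out, x)
                  for x in product([0, 1], repeat=2))
    counts[table] += 1

# Under the talk's proposed measure, functions realized by many weight
# configurations are "simple"; those realized by few (or none) are "complex".
for table, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(table, n)
```

With 6 binary weights there are 2^6 = 64 configurations, distributed over at most 16 possible Boolean functions of two inputs; the tally makes the uneven distribution, and hence the complexity ranking, directly visible.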