*Ph.D. Thesis, Graduate School of Mathematical Sciences, University of Tokyo, March 2000.*

Learning systems, which learn from previous experiences and/or provided
examples of appropriate behavior, allow people to specify
*what* a system should do in each case rather than *how* the system
should act at each step. This greatly eases the burden on system users.

For supervised learning systems such as neural networks to learn efficiently and accurately, it is essential that they be able to utilize knowledge in forms such as logical expressions, probability distributions, and constraints on differential data, along with the provided pairs of desirable inputs and outputs.

Neural networks that can learn constraints on differential data have already been applied to pattern recognition and to differential equations. Other areas, such as robotics, have also been suggested as applications of neural networks that learn differential data.

In this dissertation, we investigate the extended framework that introduces constraints on differential data into the learning of neural networks. We also investigate other items that form the foundations for applications of neural networks that learn differential data.

First, a new and very general architecture and algorithm are introduced that enable multilayer perceptrons to learn differential data. The algorithm is applicable not only to first-order but also to higher-order differential data, and, like the backpropagation algorithm, it is completely localized to each unit of the multilayer perceptron.
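The basic setting can be sketched in a few lines: because a perceptron's output y(x) and its input derivative dy/dx are both analytic functions of the weights, derivative targets can enter the loss directly. The following is a minimal illustration only, not the thesis's localized algorithm; the sin/cos target, the network size, and the use of numeric parameter gradients are all assumptions made for brevity.

```python
import math, random

# Minimal sketch: a 1-input, 8-hidden-unit, 1-output perceptron whose
# output y(x) and input derivative dy/dx are computed analytically, so
# first-derivative targets can enter the loss alongside value targets.
random.seed(0)
H = 8
params = [random.uniform(-1.0, 1.0) for _ in range(3 * H)]  # (w, b, v) per unit

def net(p, x):
    y = dy = 0.0
    for j in range(H):
        w, b, v = p[3 * j], p[3 * j + 1], p[3 * j + 2]
        t = math.tanh(w * x + b)
        y += v * t
        dy += v * w * (1.0 - t * t)  # chain rule: d/dx tanh(w*x + b)
    return y, dy

# Assumed example target: f(x) = sin(x), with derivative cos(x).
xs = [0.5 * i for i in range(7)]
def loss(p):
    return sum((net(p, x)[0] - math.sin(x)) ** 2 +
               (net(p, x)[1] - math.cos(x)) ** 2 for x in xs)

# Plain gradient descent; parameter gradients are taken numerically here
# for brevity (the thesis's algorithm instead localizes them per unit).
eps, lr = 1e-5, 0.01
initial = loss(params)
for step in range(1500):
    base = loss(params)
    grad = []
    for i in range(len(params)):
        q = list(params)
        q[i] += eps
        grad.append((loss(q) - base) / eps)
    params = [p - lr * g for p, g in zip(params, grad)]
final = loss(params)
```

The combined loss drives the network toward matching both the sampled values and the sampled derivatives, which is the essential difference from training on input/output pairs alone.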

Then the architecture and the algorithm are implemented as computer programs, which required considerable programming skill and care. The main module is written in C++.

The implementation is used to conduct experiments that, among other things, show the convergence of neural networks trained on differential data of up to third order.

Along with the architecture and the algorithm, we give analyses of neural networks that learn differential data: a comparison with the extra-pattern scheme, how the learning works, sample complexity, the effects of irrelevant features, and noise robustness.

A new application of neural networks that learn differential data to continuous action generation in reinforcement learning is described, together with experiments using the implementation. The problem reduces to realizing a random vector generator for a given probability distribution, which corresponds to solving a first-order differential equation.
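The reduction can be seen with ordinary numerics: the cumulative distribution F satisfies the first-order equation F'(x) = p(x), and inverting F maps uniform random numbers to samples from p. The sketch below solves that equation by quadrature rather than by a trained network, and the truncated-Gaussian density is an assumed example, not one from the thesis.

```python
import math, random, bisect

# Generator problem via its first-order ODE: the CDF F satisfies
# F'(x) = p(x) with F(lo) = 0; inverting F maps uniform noise to
# samples from p.  Here F is tabulated by trapezoid-rule quadrature.
def make_sampler(p, lo, hi, n=20000):
    h = (hi - lo) / n
    cdf = [0.0]
    for i in range(n):
        x0, x1 = lo + i * h, lo + (i + 1) * h
        cdf.append(cdf[-1] + 0.5 * (p(x0) + p(x1)) * h)
    total = cdf[-1]  # normalizer (p need not integrate to 1)
    def sample():
        u = random.random() * total
        return lo + bisect.bisect_left(cdf, u) * h  # invert F numerically
    return sample

# Assumed example density: standard normal truncated to [-4, 4].
random.seed(0)
sample = make_sampler(lambda x: math.exp(-0.5 * x * x), -4.0, 4.0)
draws = [sample() for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

A network trained on the derivative constraint F'(x) = p(x) plays the role of the tabulated CDF here, which is what makes the action-generation problem an instance of learning differential data.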

In addition to the above application to reinforcement learning, two further possible applications of neural networks that learn differential data are proposed: solving differential equations and simulating the human arm. For differential equations, we propose a very general framework that unifies differential equations, boundary conditions, and other constraints. For the arm simulation, we propose a natural neural-network implementation of the minimum-torque-change model.

Finally, we present results on higher-order extensions to radial basis function (RBF) networks: minimizing solutions with differential error terms, the best-approximation property of those solutions, and a proof of the $C^l$ denseness of RBF networks.
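Because an RBF network and its input derivative are both linear in the output weights, value constraints and first-derivative constraints combine into a single linear system. The sketch below fits such a solution exactly; the Gaussian basis, the centers, and the sin/cos targets are assumptions for illustration, while the dissertation treats the general higher-order theory.

```python
import math

# Gaussian RBF network fitted to value and first-derivative targets:
# both f and f' are linear in the weights w, so the combined
# constraints form one square linear system A w = b.
centers = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
pts = [-0.5, 0.25, 1.0]                 # constraint locations
target, dtarget = math.sin, math.cos    # assumed example targets

def phi(x, c):
    return math.exp(-(x - c) ** 2)

def dphi(x, c):                          # d/dx of phi(x, c)
    return -2.0 * (x - c) * math.exp(-(x - c) ** 2)

A = ([[phi(x, c) for c in centers] for x in pts] +   # value rows
     [[dphi(x, c) for c in centers] for x in pts])   # derivative rows
b = [target(x) for x in pts] + [dtarget(x) for x in pts]

# Solve A w = b by Gaussian elimination with partial pivoting.
n = len(b)
for i in range(n):
    p = max(range(i, n), key=lambda r: abs(A[r][i]))
    A[i], A[p] = A[p], A[i]
    b[i], b[p] = b[p], b[i]
    for r in range(i + 1, n):
        m = A[r][i] / A[i][i]
        A[r] = [arj - m * aij for arj, aij in zip(A[r], A[i])]
        b[r] -= m * b[i]
w = [0.0] * n
for i in range(n - 1, -1, -1):
    w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]

def model(x):
    return sum(wj * phi(x, c) for wj, c in zip(w, centers))
```

The fitted model reproduces both the values and the derivatives at the constraint points; the minimizing solutions studied in the thesis generalize this to differential error terms of arbitrary order.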

Through these detailed accounts of the architecture, the algorithm, the implementation, the analyses, and the applications, this dissertation as a whole lays the foundations for applying neural networks that learn differential data as learning systems, and should help promote their further application.

Ryusuke Masuoka and Michio Yamada, "Neural Networks Which Learn From Differential Data," Technical Report of IEICE, NC94-66 (1995-02), pp. 49-56, February 1995. (In Japanese.)

Ryusuke Masuoka, "Noise Robustness of EBNN Learning," Proceedings of the 1993 International Joint Conference on Neural Networks, Nagoya, Japan, October 25-29, 1993, Vol. 2, pp. 1665-1668.