TY - JOUR

T1 - Equivalence in knowledge representation

T2 - Automata, recurrent neural networks, and dynamical fuzzy systems

AU - Giles, C. Lee

AU - Omlin, Christian W.

AU - Thornber, Karvel K.

PY - 1999

Y1 - 1999

N2 - Neurofuzzy systems - the combination of artificial neural networks with fuzzy logic - have become useful in many application domains. However, conventional neurofuzzy models usually need enhanced representational power for applications that require context and state (e.g., speech, time series prediction, control). Some of these applications can be readily modeled as finite state automata. Previously, it was proved that deterministic finite state automata (DFA) can be synthesized by or mapped into recurrent neural networks by directly programming the DFA structure into the weights of the neural network. Based on those results, a synthesis method is proposed for mapping fuzzy finite state automata (FFA) into recurrent neural networks. Furthermore, this mapping is suitable for direct implementation in very large scale integration (VLSI), i.e., the encoding of FFA as a generalization of the encoding of DFA in VLSI systems. The synthesis method requires FFA to undergo a transformation prior to being mapped into recurrent networks. The neurons are provided with an enriched functionality in order to accommodate a fuzzy representation of FFA states. This enriched neuron functionality also permits fuzzy parameters of FFA to be directly represented as parameters of the neural network. We also prove the stability of the fuzzy finite state dynamics of the constructed neural networks for finite values of the network weights and, through simulations, give empirical validation of the proofs. Hence, we prove various knowledge equivalence representations between neural and fuzzy systems and models of automata.

AB - Neurofuzzy systems - the combination of artificial neural networks with fuzzy logic - have become useful in many application domains. However, conventional neurofuzzy models usually need enhanced representational power for applications that require context and state (e.g., speech, time series prediction, control). Some of these applications can be readily modeled as finite state automata. Previously, it was proved that deterministic finite state automata (DFA) can be synthesized by or mapped into recurrent neural networks by directly programming the DFA structure into the weights of the neural network. Based on those results, a synthesis method is proposed for mapping fuzzy finite state automata (FFA) into recurrent neural networks. Furthermore, this mapping is suitable for direct implementation in very large scale integration (VLSI), i.e., the encoding of FFA as a generalization of the encoding of DFA in VLSI systems. The synthesis method requires FFA to undergo a transformation prior to being mapped into recurrent networks. The neurons are provided with an enriched functionality in order to accommodate a fuzzy representation of FFA states. This enriched neuron functionality also permits fuzzy parameters of FFA to be directly represented as parameters of the neural network. We also prove the stability of the fuzzy finite state dynamics of the constructed neural networks for finite values of the network weights and, through simulations, give empirical validation of the proofs. Hence, we prove various knowledge equivalence representations between neural and fuzzy systems and models of automata.

UR - http://www.scopus.com/inward/record.url?scp=0033360022&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033360022&partnerID=8YFLogxK

U2 - 10.1109/5.784244

DO - 10.1109/5.784244

M3 - Article

AN - SCOPUS:0033360022

VL - 87

SP - 1623

EP - 1640

JO - Proceedings of the IEEE

JF - Proceedings of the IEEE

SN - 0018-9219

IS - 9

ER -