Interaction computing (IC) aims to map the properties of integrable low-dimensional non-linear dynamical systems to the discrete domain of finite-state automata, in an attempt to reproduce in software the self-organizing and dynamically stable properties of sub-cellular biochemical systems. As the work reported in this paper is still at the early stages of theory development, it focuses on the analysis of a particularly simple chemical oscillator, the Belousov-Zhabotinsky (BZ) reaction. After retracing the rationale for IC developed over the past several years from the physical, biological, mathematical, and computer science points of view, the paper presents an elementary discussion of the Krohn-Rhodes decomposition of finite-state automata (including the holonomy decomposition of a simple automaton) and of its interpretation as an abstract positional number system. The method is then applied to the analysis of the algebraic properties of discrete finite-state automata derived from a simplified Petri net model of the BZ reaction. In the simplest possible, symmetrical case the corresponding automaton is, not surprisingly, found to contain exclusively cyclic groups. In a second, asymmetrical case, the decomposition is much more complex and includes five different simple non-abelian groups, whose potential relevance arises from their ability to encode functionally complete algebras. The possible computational relevance of these findings is discussed and preliminary conclusions are drawn.
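As a rough illustration of the kind of object such a decomposition operates on, the sketch below uses a toy three-state automaton (a hypothetical example, not the paper's actual BZ-derived Petri net model): it generates the transition monoid by closing the input transformations under composition and counts the invertible elements, which here form the cyclic group C3, the simplest instance of the group content that a Krohn-Rhodes/holonomy decomposition makes explicit.

```python
from itertools import product

# Toy 3-state automaton (hypothetical, for illustration only): each input
# letter acts as a transformation on the state set. Closing the generators
# under composition yields the transition monoid, the algebraic object that
# Krohn-Rhodes / holonomy decomposition analyses; its permutation elements
# reveal any group content.

STATES = (0, 1, 2)

GENERATORS = {
    "a": (1, 2, 0),   # a cyclic permutation of the three states
    "b": (0, 0, 2),   # a non-invertible "collapsing" transformation
}

def compose(f, g):
    """Apply f first, then g (both are transformations on STATES)."""
    return tuple(g[f[s]] for s in STATES)

def transition_monoid(generators):
    """Close the generator set under composition (brute force)."""
    identity = tuple(STATES)
    monoid = {identity} | set(generators.values())
    changed = True
    while changed:
        changed = False
        for f, g in product(list(monoid), repeat=2):
            h = compose(f, g)
            if h not in monoid:
                monoid.add(h)
                changed = True
    return monoid

if __name__ == "__main__":
    monoid = transition_monoid(GENERATORS)
    permutations = [t for t in monoid if len(set(t)) == len(STATES)]
    print(f"monoid size: {len(monoid)}")
    print(f"invertible (group) elements: {len(permutations)}")
    # The invertible elements here are exactly {id, a, a^2}, i.e. the
    # cyclic group C3 embedded in an otherwise irreversible monoid.
```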
Source (DOI): http://dx.doi.org/10.1016/j.biosystems.2013.03.003
ISA Trans
August 2025
School of Automation and Electrical Engineering, Zhejiang University of Science and Technology, Hangzhou, 310023, China.
This paper addresses the task-priority projection problem within the Null-Space-based Behavioral Control (NSBC) framework. We propose a Reliable Intelligent Task Supervisor (RITS) for dynamic task-priority projection in networked Nonholonomic Mobile Robots (NMRs). Firstly, the NSBC task paradigm is extended to accommodate the NMRs by integrating nonholonomic constraints into both task functions and the projection operator.
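For context, the generic null-space projection underlying NSBC combines a primary and a secondary task as v = J1⁺v1 + (I − J1⁺J1)J2⁺v2. The minimal sketch below (NumPy, with a hypothetical planar example rather than the paper's RITS supervisor or its nonholonomic task functions) shows how the secondary task's velocity is filtered through the null-space projector of the primary task so that it cannot disturb the higher-priority task.

```python
import numpy as np

# Generic NSBC task-priority projection (illustrative sketch, not the
# paper's RITS supervisor): the secondary task velocity is projected onto
# the null space of the primary task Jacobian.

def nsbc_velocity(J1, v1, J2, v2):
    """Combine two task velocities with null-space projection.

    J1, J2 : task Jacobians (m_i x n)
    v1, v2 : desired task-space velocities (m_i,)
    Returns the commanded configuration-space velocity (n,).
    """
    J1_pinv = np.linalg.pinv(J1)
    J2_pinv = np.linalg.pinv(J2)
    n = J1.shape[1]
    N1 = np.eye(n) - J1_pinv @ J1          # null-space projector of task 1
    return J1_pinv @ v1 + N1 @ (J2_pinv @ v2)

if __name__ == "__main__":
    # Hypothetical planar example: task 1 controls x only, task 2 wants full motion.
    J1 = np.array([[1.0, 0.0]])
    J2 = np.eye(2)
    v = nsbc_velocity(J1, np.array([0.5]), J2, np.array([1.0, 1.0]))
    print(v)   # x-component is dictated by task 1; task 2 only acts along y
```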
Sci Rep
July 2025
Department of Computer Science, Oklahoma State University, Stillwater, 74075, USA.
Quantum computing leverages unitary matrices to perform reversible computations while preserving probability norms. However, many real-world applications involve non-unitary sparse matrices, posing a challenge for quantum implementation. This paper introduces a novel method for transforming a class of non-unitary sparse binary matrices into higher-dimensional permutation matrices, ensuring unitarity.
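One standard way to see why such an embedding is possible (not necessarily the construction used in this paper) is the reversible dilation (x, y) → (x, (y + f(x)) mod N): a sparse binary matrix with a single 1 per column encodes a map f on basis states, and even when f is non-injective, so that the matrix is non-unitary, the dilated map is a permutation on the N²-dimensional space and hence unitary. A minimal sketch:

```python
import numpy as np

# Illustrative reversible dilation (an assumption, not necessarily the
# paper's construction): a 0/1 matrix A with exactly one 1 per column
# encodes f(x) = row index of the 1 in column x. The map
# (x, y) -> (x, (y + f(x)) mod N) is always a bijection, so its matrix
# on the N*N-dimensional space is a permutation, hence unitary.

def reversible_embedding(A):
    N = A.shape[0]
    f = [int(np.argmax(A[:, x])) for x in range(N)]
    P = np.zeros((N * N, N * N))
    for x in range(N):
        for y in range(N):
            P[x * N + (y + f[x]) % N, x * N + y] = 1.0
    return P

if __name__ == "__main__":
    # Hypothetical non-unitary example: both basis states map to state 1.
    A = np.array([[0.0, 0.0],
                  [1.0, 1.0]])
    P = reversible_embedding(A)
    print(np.allclose(P @ P.T, np.eye(4)))   # True: P is a permutation, so unitary
```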
Front Comput Neurosci
May 2023
Cisco Secure Workload, Cisco, San Jose, CA, United States.
How do humans learn the regularities of their complex noisy world in a robust manner? There is ample evidence that much of this learning and development occurs in an unsupervised fashion through interactions with the environment. Both the structure of the world and the brain appear hierarchical in a number of ways, and structured hierarchical representations offer potential benefits for efficient learning and organization of knowledge, such as concepts (patterns) sharing parts (subpatterns), and for providing a foundation for symbolic computation and language. A major question arises: what drives the processes behind acquiring such hierarchical spatiotemporal concepts? We posit that the goal of advancing one's predictions is a major driver for learning such hierarchies and introduce an information-theoretic score that shows promise in guiding the processes, and, in particular, motivating the learner to build larger concepts.
Front Neurosci
January 2023
Redwood Center for Theoretical Neuroscience, University of California, Berkeley, Berkeley, CA, United States.
Operations on high-dimensional, fixed-width vectors can be used to distribute information from several vectors over a single vector of the same width. For example, a set of key-value pairs can be encoded into a single vector with multiplication and addition of the corresponding key and value vectors: the keys are bound to their values with component-wise multiplication, and the key-value pairs are combined into a single superposition vector with component-wise addition. The superposition vector is, thus, a memory which can then be queried for the value of any of the keys, but the result of the query is approximate.
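A minimal sketch of the binding-and-superposition scheme described above, using random bipolar (±1) vectors; the dimensionality, the bipolar component choice, and the dot-product cleanup step are assumptions for illustration rather than the paper's exact setup.

```python
import numpy as np

# Key-value binding with component-wise multiplication, superposition with
# addition, and approximate retrieval by unbinding and cleanup.

rng = np.random.default_rng(0)
D = 10_000                                   # vector width (assumed)

def rand_vec():
    return rng.choice([-1.0, 1.0], size=D)

keys = {name: rand_vec() for name in ("color", "shape", "size")}
values = {"red": rand_vec(), "round": rand_vec(), "small": rand_vec()}

# Bind each key to its value (component-wise multiply), superpose (add).
memory = (keys["color"] * values["red"]
          + keys["shape"] * values["round"]
          + keys["size"] * values["small"])

# Query: unbind with the key, then "clean up" against the known value vectors.
query = keys["shape"] * memory               # approximately equals values["round"]
best = max(values, key=lambda v: np.dot(query, values[v]) / D)
print(best)                                  # -> "round" (with high probability)
```

The retrieved vector is only approximately equal to the stored value because the other bound pairs contribute quasi-orthogonal noise, which is why the final cleanup against a codebook of known values is needed.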
IEEE Trans Pattern Anal Mach Intell
June 2023
Recurrent neural networks are a widely used class of neural architectures. They have, however, two shortcomings. First, they are often treated as black-box models and as such it is difficult to understand what exactly they learn as well as how they arrive at a particular prediction.