Draft:Algebraic Machine Learning
Submission declined on 23 July 2025 by Stuartyeates (talk). This is not a coherent summary of this field. Drop the bogus comparison with ANN and focus on explaining what this is.
In symbolic AI, Algebraic Machine Learning (AML) is a machine learning method that uses abstract algebra and model theory as its foundational framework, without relying on optimization or statistical techniques.
In AML, learning from both data and goals is encoded as algebraic identities within an algebraic structure, typically a semilattice.
The method involves computing a model of these identities, which is decomposed as a subdirect product.
Among all possible models, the freest model corresponds to the formal logical deductive closure of the identities.
Generalization occurs by discarding factors in the subdirect decomposition of the freest model.
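The semilattice identities underlying AML can be illustrated with a toy sketch, assuming terms are modeled as Python frozensets and the semilattice operation as set union (the names `merge`, `x`, `y`, and `z` are hypothetical and used only for exposition):

```python
# Illustrative sketch of the semilattice identities AML works over,
# using Python frozensets with set union as the idempotent operation.
# This is a toy demonstration, not the AML learning algorithm itself.

def merge(a: frozenset, b: frozenset) -> frozenset:
    """The semilattice binary operation (here, set union)."""
    return a | b

x, y, z = frozenset({"c1"}), frozenset({"c2"}), frozenset({"c3"})

assert merge(x, x) == x                                # idempotence
assert merge(x, y) == merge(y, x)                      # commutativity
assert merge(merge(x, y), z) == merge(x, merge(y, z))  # associativity
```

Any idempotent, commutative, associative operation induces such a semilattice; set union is used here only because it makes the three identities easy to check.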
AML has been applied to data-driven pattern recognition, including image classification on benchmark datasets such as MNIST and CIFAR-10.
It has also been used in formal problems, such as finding Hamiltonian cycles in graphs, solving Sudoku puzzles, and the n-queens completion problem.
In these cases, AML learns directly from the problem specification without relying on explicit search algorithms.
Overview
AML relies on model theory to represent problems as algebraic structures[1]. Data, constraints, and rules are encoded using first-order logic, resulting in a set of algebraic sentences. During training, the sparse crossing algorithm[2] iteratively modifies and evolves these algebraic structures to model the target problem. AML does not rely on optimization or error minimization; instead, generalization is obtained by finding a small set of components that model the positive and negative algebraic sentences, as smaller models are found to generalize better[2].
Semantic embedding
Semantic embedding is the process of encoding the data, constraints, and rules of a problem as algebraic sentences. This set forms the problem's algebraic theory and can be seen as the union of all positive and negative algebraic sentences.
The embedding constants describe the smallest concepts of a problem, which are used to build more complex concepts, e.g. "pixel at position (3,4) is red", "label: it is a boat", or "graph edge: node 3 12". The embedding constants can be seen as the alphabet used for the semantic embedding.
Terms describe complex ideas and are built as idempotent unions of embedding constants. E.g., an image is a collection of pixels covering the whole space, and a graph is a collection of constants representing its nodes and the edges joining them.
An algebraic sentence establishes an order relation between two terms, e.g. that a certain set of pixels forms the image of a boat.
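The embedding described above can be sketched in a toy form, assuming embedding constants are plain strings, terms are frozensets, and the order relation is modeled as subset inclusion (all names below are hypothetical; this is not the reference AML implementation):

```python
# Illustrative sketch of semantic embedding: constants are strings,
# terms are idempotent unions of constants (modeled as frozensets),
# and an algebraic sentence asserts an order between two terms.
# Toy encoding for exposition only.

# Embedding constants: the "alphabet" of smallest concepts.
pixels = {f"pixel({r},{c})=blue" for r in range(2) for c in range(2)}
label = "label:boat"

# Terms: idempotent unions of constants (set union is idempotent).
image_term = frozenset(pixels)
labeled_image = image_term | {label}

def leq(t1: frozenset, t2: frozenset) -> bool:
    """Order relation between terms, here modeled as subset inclusion."""
    return t1 <= t2

# A positive sentence: the image term lies below the labeled image term.
assert leq(image_term, labeled_image)
```

In this sketch the subset order stands in for the semilattice order; the actual order used by AML is determined by the algebraic theory built from all positive and negative sentences.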
Sparse crossing
Characteristics
AML displays the following characteristics:
- No overfitting.
- Support for both continuous and discrete symbols.
- Data can be combined with formal knowledge. Constraints, rules, and goals can be encoded alongside data.
- A single algorithm (sparse crossing) and multi-modality by construction.
- Model additivity: independent models can be combined, allowing for distributed AI, horizontal scalability of computation, and quantum machine learning[3].
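The model additivity property can be sketched under the simplifying, hypothetical assumption that a trained model is represented as a set of learned order relations between terms; combining independent models is then a union of their relation sets:

```python
# Illustrative sketch of model additivity. The representation of a model
# as a plain set of (term, label) relations is a hypothetical
# simplification for exposition, not the actual AML model format.

model_a = {("image_term_1", "label:boat")}   # learned from dataset A
model_b = {("image_term_2", "label:car")}    # learned from dataset B

combined_model = model_a | model_b           # independent models add by union
assert ("image_term_1", "label:boat") in combined_model
assert ("image_term_2", "label:car") in combined_model
```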
References
- ^ Burris, Stanley; Sankappanavar, Hanamantagouda (1981). A course in universal algebra. Springer New York, NY. ISBN 978-1-4613-8132-7.
- ^ a b Martin-Maroto, Fernando; G. de Polavieja, Gonzalo (2018). "Algebraic Machine Learning". arXiv:1803.05252 [cs.LG].
- ^ Malov, Dmitrii (August 2020). "Quantum Algebraic Machine Learning". 2020 IEEE 10th International Conference on Intelligent Systems (IS). pp. 426–430. doi:10.1109/IS48319.2020.9199982. ISBN 978-1-7281-5456-5.
- "Algebraic AI (slides)" (PDF). AITP 2024. Retrieved 22 July 2025.
- Martin-Maroto, Fernando; Abderrahaman, Nabil; Méndez, David; G. de Polavieja, Gonzalo (2025). "Algebraic Machine Learning: Learning as computing an algebraic decomposition of a task". arXiv:2502.19944 [cs.LG].
- Haidar, Imane M.; Sliman, Layth; Damaj, Issam W.; Haidar, Ali Massoud (2024). "High Performance and Lightweight Single Semi-Lattice Algebraic Machine Learning". IEEE Access. 12: 50517–50536. Bibcode:2024IEEEA..1250517H. doi:10.1109/ACCESS.2024.3376525.
- Haidar, Imane M.; Sliman, Layth; Damaj, Issam W.; Haidar, Ali M. (2024). "Legacy Versus Algebraic Machine Learning: A Comparative Study". 2nd International Congress of Electrical and Computer Engineering. EAI/Springer Innovations in Communication and Computing. pp. 175–188. doi:10.1007/978-3-031-52760-9_13. ISBN 978-3-031-52759-3.