OpenCog
OpenCog is a project that aims to build an open-source artificial intelligence framework. OpenCog Prime is an architecture for robot and virtual embodied cognition that defines a set of interacting components designed to give rise to human-equivalent artificial general intelligence (AGI) as an emergent phenomenon of the whole system.[2] OpenCog Prime's design is primarily the work of Ben Goertzel, while the OpenCog framework is intended as a generic framework for broad-based AGI research. Research utilizing OpenCog has been published in journals and presented at conferences and workshops, including the annual Conference on Artificial General Intelligence. OpenCog is released under the terms of the GNU Affero General Public License.
| Open Source Artificial Intelligence | |
| --- | --- |
| Original author(s) | OpenCog Developers |
| Developer(s) | OpenCog Foundation |
| Initial release | 21 January 2008[1] |
| Repository | |
| Written in | C++, Python, Scheme |
| Platform | Linux |
| Type | Artificial general intelligence |
| License | GNU Affero General Public License |
| Website | opencog |
OpenCog is in use by more than 50 companies, including Huawei and Cisco.[3]
Origin
OpenCog was originally based on the 2008 release of the source code of the proprietary "Novamente Cognition Engine" (NCE) of Novamente LLC. The original NCE code is discussed in the PLN book (see Sources below). Ongoing development of OpenCog is supported by the Artificial General Intelligence Research Institute (AGIRI), the Google Summer of Code project, Hanson Robotics, SingularityNET and others.
Components
OpenCog consists of:
- A graph database, dubbed the AtomSpace, that holds "atoms" (that is, terms, atomic formulas, sentences and relationships) together with their "values" (valuations or interpretations, which can be thought of as per-atom key-value databases). An example of a value is a truth value. Atoms are globally unique, immutable and indexed (searchable); values are transient and mutable.
- A collection of pre-defined atoms, termed Atomese, used for generic knowledge representation, such as conceptual graphs and semantic networks, as well as to represent and store the rules (in the sense of term rewriting) needed to manipulate such graphs. (A brief Atomese sketch follows this list.)
- A collection of pre-defined atoms that encode a type subsystem, including type constructors and function types. These are used to specify the types of variables, terms and expressions, and to specify the structure of generic graphs containing variables.
- A collection of pre-defined atoms that encode both functional and imperative programming styles. These include atoms for lambda abstraction (binding free variables into bound variables) and for performing beta reduction.
- A collection of pre-defined atoms that encode a satisfiability modulo theories solver, built in as part of a generic graph query engine for graph and hypergraph pattern matching (isomorphic subgraph discovery). This generalizes the idea of a structured query language (SQL) to the domain of generic graph queries; it is an extended form of a graph query language. (A query sketch follows this list.)
- A generic rule engine, including a forward chainer and a backward chainer, that chains rules together. The rules are exactly the graph queries of the query subsystem, so the rule engine loosely resembles a query planner. It is designed to allow different kinds of inference engines and reasoning systems to be implemented, such as Bayesian inference or fuzzy logic, as well as practical tools such as constraint solvers or motion planners.
- An attention allocation subsystem based on economic theory, termed ECAN.[4] This subsystem is used to control the combinatorial explosion of search possibilities encountered during inference and chaining.
- An implementation of a probabilistic reasoning engine based on probabilistic logic networks (PLN). The current implementation uses the rule engine to chain together specific rules of logical inference (such as modus ponens), together with specific mathematical formulas that assign a probability and a confidence to each deduction (one such formula is sketched after this list). This subsystem can be thought of as a kind of proof assistant that works with a modified form of Bayesian inference.
- A probabilistic genetic program evolver called Meta-Optimizing Semantic Evolutionary Search (MOSES).[5] It is used to discover collections of short Atomese programs that accomplish tasks; these can be thought of as performing a kind of decision tree learning, resulting in a kind of decision forest, or rather, a generalization thereof.
- A natural language input system based on Link Grammar, partly inspired by both Meaning-Text Theory and Dick Hudson's Word Grammar, which encodes semantic and syntactic relations in Atomese.
- A natural language generation system.[6]
- An implementation of Psi-Theory for handling emotional states, drives and urges, dubbed OpenPsi.[7]
- Interfaces to Hanson Robotics robots, including emotion modelling[8] via OpenPsi. This includes the Loving AI project, used to demonstrate meditation techniques.
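As an illustration of the AtomSpace and Atomese items above, the following is a minimal sketch written in the AtomSpace's Scheme shell; it uses common OpenCog atom types, but the exact module names and truth-value syntax may vary between releases.

```scheme
;; Minimal Atomese sketch (illustrative; syntax may vary by release).
(use-modules (opencog))

;; Two concept atoms and an inheritance link between them.  The
;; SimpleTruthValue (stv strength confidence) attached to the link is
;; one example of a "value" stored alongside an atom.
(ConceptNode "cat")
(ConceptNode "animal")
(InheritanceLink (stv 0.9 0.8)
    (ConceptNode "cat")
    (ConceptNode "animal"))
```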
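The graph query engine can then be asked to find all atoms matching a pattern. The sketch below is likewise illustrative of the Scheme interface rather than a definitive API; it asks for every atom that inherits from "animal".

```scheme
;; Illustrative graph query (API details may differ by release).
(use-modules (opencog) (opencog exec))

;; "Find every $x such that $x inherits from animal."
(define find-animals
    (GetLink
        (VariableNode "$x")
        (InheritanceLink
            (VariableNode "$x")
            (ConceptNode "animal"))))

;; cog-execute! runs the query against the AtomSpace and returns a
;; SetLink wrapping the matches, e.g. (ConceptNode "cat") above.
(cog-execute! find-animals)
```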
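As an example of the kind of formula PLN attaches to an inference step, one commonly cited form of its deduction rule (concluding A→C from A→B and B→C under an independence assumption) assigns the conclusion the strength

$$ s_{AC} = s_{AB}\, s_{BC} + \frac{(1 - s_{AB})\,(s_C - s_B\, s_{BC})}{1 - s_B} $$

where $s_{XY}$ is the strength of the implication X→Y and $s_X$ is the strength (roughly, the probability) of the term X; a separate formula assigns a confidence to the result. The exact formulas depend on the PLN variant in use (see the PLN book under Sources).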
Organization and funding
In 2008, the Machine Intelligence Research Institute (MIRI), then called the Singularity Institute for Artificial Intelligence (SIAI), sponsored several researchers and engineers working on OpenCog. Many contributions from the open source community have been made since OpenCog's participation in the Google Summer of Code in 2008 and 2009. MIRI no longer supports OpenCog.[9] OpenCog has received funding and support from several sources, including the Hong Kong government, Hong Kong Polytechnic University, the Jeffrey Epstein VI Foundation[10] and Hanson Robotics. The OpenCog project is currently affiliated with SingularityNET and Hanson Robotics.
Sources
- Hart, D.; Goertzel, B. (2008). OpenCog: A Software Framework for Integrative Artificial General Intelligence (PDF). Proceedings of the First AGI Conference.
- Goertzel, B.; Iklé, M.; Goertzel, I. F.; Heljakka, A. (2009). Probabilistic Logic Networks: A Comprehensive Framework for Uncertain Inference. Springer. ISBN 978-0-387-76871-7.
References
- "OpenCog Release". 21 January 2008. Retrieved 21 January 2008.
- "OpenCog: Open-Source Artificial General Intelligence for Virtual Worlds | CyberTech News". 2009-03-06. Archived from the original on 2009-03-06. Retrieved 2016-10-01.
- Rogers, Stewart (2017-12-07). "SingularityNET talks collaborative AI as its token sale hits 400% oversubscription". venturebeat.com. VentureBeat. Retrieved 2018-03-13.
- "Economic Attention Allocation".
- "MOSES".
- "Natural Language Generation".
- "OpenPsi".
- "Archived copy". Archived from the original on 2018-03-19. Retrieved 2015-04-24.
- Ben Goertzel (2010-10-29). "The Singularity Institute's Scary Idea (and Why I Don't Buy It)". The Multiverse According to Ben. Retrieved 2011-06-24.
- "Science Funder Jeffrey Epstein Launches Radical Emotional Software". Forbes. Oct 2, 2013.
External links
- Official website
- OpenCog Wiki
- AGI 2011: OpenCog - GoogleTechTalks on YouTube
- AGI 2011: Architectures Part I - GoogleTechTalks on YouTube
- Artificial General Intelligence: Now Is the Time - 2007 GoogleTechTalks on YouTube
- CogPrime: An Integrative Architecture for Embodied Artificial General Intelligence
- OpenCog: An Open Source Software Framework & A Design & Vision for Advanced AGI, video on YouTube. Talk given at Monash University, Australia, September 2011. Adam Ford
- Video introduction to OpenCog by Ben Goertzel on YouTube. Goertzel speaks on OpenCog in Tai Po, Hong Kong, December 2011. Adam Ford
- Ben Goertzel - The Future of AGI: OpenCog Development in Asia, video on YouTube. Adam Ford