From Molecule to Energy to Metaphor: A Novel "Energy-Based" Approach to Embodied, Brain-Derived Deep Commonsense, Cultural, and Emotional Simulation and Reasoning
Daniel J. Olsher
Carnegie Mellon University and Cognitive Science Program, Temasek Labs, Singapore
Tuesday, October 22, 2013
12:30 p.m., Conference Room 5A
Abstract:
Is ‘bread’ likely to be considered ‘tasty’? What about in France? If someone kicked you, how would that make you feel?
If I’m angry now, what may have just happened? And just what does ‘dog’ actually ‘mean’?
Providing a highly novel perspective on non-logical reasoning, Energy-Based Knowledge Representation (EBKR) enables deep simulation of nuanced, contextualized, embodied meaning. In NLP, it has been empirically shown to outperform state-of-the-art systems in document topic modeling (vs. Latent Dirichlet Allocation) and in coverage of category information extraction and construction mining/pattern extraction (vs. OLLIE and ReVerb). In AI, we have working demos of EBKR’s ability to model and simulate culture, emotion, the emotional consequences of actions, worldviews, gisting, and other key AI tasks. We also have a working deep-semantics Construction Grammar parser.
We have built initial demos of solutions to interesting AI problems such as goal inference (if I have a knife and fork, what do I plan to do?), future and past prediction, agent-based negotiation reasoning, automated lexical item sense induction, automated category induction, semantic category membership calculation, gisting, topic modeling, metaphor, and other problems on top of our single highly flexible representation and ~9 million-item commonsense knowledge base.
Our commonsense knowledge base offers strong resistance to noisy data and can transparently fuse data from diverse sources and domains. It allows new (and possibly contradictory) knowledge to be introduced into the KB without affecting pre-existing reasoning capabilities. The KB incorporates nearly all existing open-source commonsense knowledge sources (ConceptNet, WordNet, NELL, Probase, DBpedia, and more), translated into the INTELNET EBKR formalism.
We have initial working framing, political, and worldview/culture-based persuasion models, and all of our work addresses unconscious processing in some way. The approach can also be applied to priming and to Lakoffian accounts of framing and political processes.
Our system is highly computationally tractable, allowing thousands of pieces of knowledge to be combined efficiently on a single CPU.
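To make the flavor of "energy-based" reasoning over a commonsense knowledge base concrete, the toy sketch below spreads activation energy through a tiny weighted concept graph. This is an illustration only, not the speaker's actual EBKR/INTELNET implementation: all concepts, edge weights, and the decay parameter are invented for the example.

```python
# Toy sketch of energy spreading over a concept graph (invented data;
# NOT the actual EBKR/INTELNET system). Energy injected at source
# concepts flows along weighted edges, and the resulting activation
# levels serve as graded, context-sensitive judgments.

from collections import defaultdict

# Directed weighted edges: concept -> [(neighbor, weight)]
GRAPH = {
    "bread":    [("food", 0.9), ("tasty", 0.6), ("baguette", 0.5)],
    "france":   [("baguette", 0.8), ("cuisine", 0.7)],
    "baguette": [("bread", 0.9), ("tasty", 0.7)],
    "food":     [("tasty", 0.4)],
    "cuisine":  [("tasty", 0.5)],
}

def spread_energy(sources, steps=2, decay=0.5):
    """Inject one unit of energy at each source concept, then let it
    flow along weighted edges for a fixed number of steps; returns the
    final activation level of every touched concept."""
    energy = defaultdict(float)
    for concept in sources:
        energy[concept] = 1.0
    for _ in range(steps):
        new_energy = defaultdict(float, energy)
        for concept, level in energy.items():
            for neighbor, weight in GRAPH.get(concept, []):
                new_energy[neighbor] += level * weight * decay
        energy = new_energy
    return dict(energy)

# Context changes the answer: activating "france" alongside "bread"
# routes extra energy to "tasty" via "baguette" and "cuisine".
plain = spread_energy(["bread"])
in_france = spread_energy(["bread", "france"])
print(plain["tasty"], in_france["tasty"])
```

Because each step touches only the edges of currently active concepts, a pass like this stays cheap even over a large graph, which is one plausible reading of how thousands of knowledge pieces can be combined on a single CPU.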
Using hands-on demonstrations, the talk motivates the need for novel approaches to AI representation and reasoning, presents results, and discusses potential future directions (metaphor mappings, NLP, intelligent agents with IQ and CQ, and more).
Bio:
Daniel Olsher specializes in ambiguous, difficult-to-model social and commonsense phenomena. Over the past 15 years, his research at Carnegie Mellon, Georgetown, DSO National Labs, and elsewhere has spanned nuanced knowledge/cultural representation and reasoning, commonsense knowledge bases and inference, cross-domain data sets, robust knowledge for data mining, full-semantics NLP, AI-based conflict resolution, and models of social systems. He has demonstrated precise reasoning over fine-grained models of Chinese, Sub-Saharan African, and Persian culture, norms, trust, emotion, politics, and social systems.
Beyond EBKR and inference/reasoning/gisting/categorization, Daniel develops commonsense-based, culturally and emotionally aware intelligent agents coupled with real-world AI systems for predicting civilian and foreign-force reactions. Ongoing humanitarian applications include culturally aware decision-making, reducing stereotype-based discrimination, and the intelligent generation of persuasive communications grounded in pre-existing beliefs regarding human rights.