7

For years, I have been dealing with (and teaching) Knowledge Representation and Knowledge Representation languages. I just discovered that in another community (Information Systems and the like) there is something called the "DIKW pyramid", which adds another step after knowledge, namely wisdom. They define data as simple symbols, information as the answer to who/what/when/where questions, knowledge as the answer to how, and wisdom as the answer to why.

My question is: has anyone made the connection between what AI calls data/information/knowledge and these notions from Information Systems? In particular, how would "wisdom" be defined in AI? And since we have KR languages, how would we represent "wisdom" as they define it?

Any references would be welcome.

nbro
yannis
  • For what it's worth, I found [this article](http://plato.stanford.edu/entries/wisdom/#WisEpiHum) pretty interesting. – DukeZhou Dec 01 '16 at 20:38

2 Answers

2

As with another answer, I am also skeptical of the distinctions made in the DIKW pyramid.

Nonetheless, a very popular machine learning approach for answering 'Why?' questions is the application of Bayesian reasoning: given a causal data model, reverse inference can be used to find the probability distribution of events which lead to a given outcome.
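As a concrete (if toy) illustration of that kind of reverse inference, here is a minimal Python sketch. The causes, the observed outcome and all of the probabilities are hypothetical, chosen only to show the mechanics of Bayes' rule, not to model anything real.

```python
# Minimal sketch of 'reverse inference' on a toy causal model.
# All variables and probabilities are hypothetical.

# Forward (causal) model: P(cause) and P(wilted_leaves | cause)
prior = {"overwatering": 0.3, "fungus": 0.1, "healthy": 0.6}
likelihood = {
    "overwatering": 0.8,
    "fungus": 0.9,
    "healthy": 0.05,
}

def posterior(prior, likelihood):
    """Answer 'Why did we observe the outcome?' via Bayes' rule:
    P(cause | outcome) is proportional to P(outcome | cause) * P(cause)."""
    unnormalised = {c: prior[c] * likelihood[c] for c in prior}
    z = sum(unnormalised.values())
    return {c: p / z for c, p in unnormalised.items()}

for cause, p in posterior(prior, likelihood).items():
    print(f"P({cause} | wilted_leaves) = {p:.3f}")
```

The forward model describes how causes produce the outcome; the 'Why?' question is answered by inverting it to obtain a distribution over causes given the observed outcome.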

It could be argued that defining 'cause' in terms of distributions rather than specific concrete mechanisms is a rather limited notion of 'Why?'.

However, it may be that there are some forms of causality that we don't know how to represent, specifically 'first-hand experience'. Indeed, common usage of the term 'wisdom' generally refers to first-hand experience, rather than information gained from some other source.

The idea is that knowledge can be expressed declaratively, whereas wisdom must be derived from experience.

For an AI represented as a computer program, the distinction between declarative knowledge and first-hand experience might appear irrelevant, since in principle any experience can be encoded and made available without the program having to 'experience' it first-hand.

However, the following humorous definition of 'wisdom' might shed some light on a distinction that's pertinent to AI research:

> Knowledge is knowing that a tomato is a fruit.
>
> Wisdom is knowing that you shouldn't eat it with custard.

This notion of 'Wisdom' could be said to require qualia. It is the subject of much debate whether qualia exist and/or are necessary for consciousness - see, for example, the thought experiment of 'The Black and White Room' (also known as Mary's Room).

So the notion is that there is a distinction between having a Bayesian network representation of wisdom that says "It is 99.7% likely that putting a tomato in custard is undesirable" and the first-hand experience of how odd it actually tastes with custard.
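To make the contrast concrete, the 'representational' half of that distinction can be as shallow as a lookup in a probability table. The sketch below is entirely hypothetical and simply hard-codes the 99.7% figure quoted above; the experiential half is exactly what such a lookup cannot supply.

```python
# Hypothetical stand-in for a learned Bayesian network's conclusion:
# the table is hand-written, and 0.997 merely echoes the "99.7%" figure above.
p_undesirable = {
    ("tomato", "custard"): 0.997,
    ("strawberry", "custard"): 0.02,
}

def how_bad_is(food, accompaniment):
    """Return P(undesirable | food, accompaniment) from the table."""
    return p_undesirable[(food, accompaniment)]

print(how_bad_is("tomato", "custard"))  # 0.997 -- a statistic, not a taste
```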

NietzscheanAI
  • "Wisdom" is also associated with good judgement, which may not require direct experience but merely knowledge of previous events or a deep understanding of a given issue. – DukeZhou Dec 01 '16 at 19:56
  • @DukeZhou The question of whether *all* such information can be obtained without first-hand experience is precisely the question that the 'Black and White room' seeks to address. – NietzscheanAI Dec 01 '16 at 20:24
  • It seems that wisdom in the sense of experience is only relevant in that it may contribute to making good decisions. The idea of complete information got me thinking about game AI, where wisdom could be defined as the ability to make optimal plays, even under conditions of incomplete information, chance, and/or mathematical or computational intractability. Game AI would seem to be "wise" enough to beat the strongest human players without the need for qualia or consciousness...
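Following up on the game-AI point in the comment above: 'optimal play under chance' is commonly formalised with something like expectimax. The sketch below is a minimal, hypothetical illustration of that idea; the toy game tree and its payoffs are invented for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass
class Chance:
    branches: List[Tuple[float, "Node"]]   # (probability, child) pairs

@dataclass
class Choice:
    options: List["Node"]                  # children the player may choose from

# A node is a terminal payoff, a chance node, or a player-choice node.
Node = Union[float, Chance, Choice]

def expectimax(node: Node) -> float:
    """Payoff at leaves, expected value at chance nodes, best child at choice nodes."""
    if isinstance(node, (int, float)):
        return float(node)
    if isinstance(node, Chance):
        return sum(p * expectimax(child) for p, child in node.branches)
    return max(expectimax(child) for child in node.options)

# The player picks between a safe payoff of 1.0 and a fair coin flip over 3.0 or 0.0.
tree = Choice([1.0, Chance([(0.5, 3.0), (0.5, 0.0)])])
print(expectimax(tree))   # 1.5: the gamble has the higher expected value
```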
0

I haven't made the connection myself - I didn't know about the pyramid. I'm not sure it translates well into AI, though.

It seems they're separating information from knowledge by splitting how from what. What is a superset of how, as far as I'm concerned. It's also a superset of why.

But from an evolutionary perspective, knowledge representation starts with why. Prior to a reason for knowledge representation, there is no knowledge representation. The 'what' existed, but it was not represented until autopoiesis created goal-directed, why-oriented behaviors that began storing the what as knowledge.

What is a superset of why, just as ontology is a superset of teleology. However, all represented ontology was acquired through teleological (end-directed) action.

So I disagree with the notion that wisdom, as a why thing, sits at the tip of the pyramid. It all started with goal-directed behavior, and that has been the source of all subsequent information growth.

So what is wisdom? I think it is too much of a folk term to warrant a technical definition. If I had to just take a swing at a definition, though, I'd probably vote for wisdom being knowledge of the ontological basis of one's own teleological knowledge - essentially objectifying one's subjective interpretations - knowing the true what and how of the why, to whatever extent is possible.

I don't have many specific references on this subject, but I thought Terrence Deacon's *Incomplete Nature: How Mind Emerged from Matter* was a good primer on teleology.

Doxosophoi
  • So, if I get you right, you suggest that "wisdom" (in the sense of the DIKW pyramid) is just a subpart of knowledge and that we should represent it using the usual knowledge representation languages? – yannis Oct 02 '16 at 08:29
  • For example, I found this paper http://www.mecs-press.org/ijisa/ijisa-v7-n7/IJISA-V7-N7-3.pdf where the authors first discuss the pyramid and then move on to well-known material on ontologies, but unfortunately they do not mention how wisdom should be represented. – yannis Oct 02 '16 at 10:04
  • Exactly. Data is data. Who, when, where, how _and_ why are all what-data. Why includes subjective preferences and goals. The non-why, ontological portion of the what-data approximates an objective representation of the external and internal world. Despite this, the particular encoding of that objective data is still highly subjective, driven by teleological forces. DNA has a highly subjective representation of its "factual" knowledge of the world. Humans are good at removing teleological, subjective bias from our ontological, factual knowledge. Wisdom is a subset of what-data, whatever it is. – Doxosophoi Oct 02 '16 at 19:09
  • As to what wisdom _actually_ is, I'm just as happy with NietzscheanAI's answer: _... knowledge can be expressed declaratively, whereas wisdom must be derived from experience._ – Doxosophoi Oct 02 '16 at 19:10
  • ... I've also heard that distinction in the past. Where that definition breaks down though is with something like a theorem and a proof, where both the writer and the reader have the experiential context of deriving the truth. Unless we can find a _technical_ distinction on knowledge to which we can apply the label "wisdom," I think it will continue to be a folk term. – Doxosophoi Oct 02 '16 at 19:18