3

I am interested in the emergence of properties in agents and, more generally, in robotics.

I was wondering whether there is work on the emergence of time-related concepts, i.e. on low-level representations of notions like *before* and *after*. I know, for example, that there is work on the emergence of spatial representations (similar to k-NN), and even of communication*, but time seems to be a tricky concept.

This has everything to do with the platform, i.e. the way the representation would be encoded. We tend to favour representations that have some meaning, or that somehow mimic natural (well, human) structures like the brain. I am not a neuroscientist and do not know what the sense of time looks like in humans, or whether it is even present in other living beings.

Is there some work on the (emergence of the) representation of time in artificial agents?


*I remember watching a really cool (actually, creepy) video of these robots, but cannot find it anymore. Does anyone have the link at hand?

Luis
  • The title is a bit confusing. Could you rephrase it? Also, do you mean explicit or implicit representation of time? Digging memory, but there is work around that! (I first thought about branching-time logic, but it is not emergent). – Eric Platon Aug 03 '16 at 06:43
  • @EricPlaton I wanted to rephrase, but am short of inspiration :P Got any suggestions? – Luis Aug 05 '16 at 14:51
  • I don't understand the question. Time is just the perception of change. How can a thinking thing do anything _other_ than perceive change? A recognition of a "rate of change" seems to be implicit in all things that deal with change, right? – Doxosophoi Sep 03 '16 at 01:22
  • @Doxosophoi I agree with the perception of change only partially: If I see a photo of two cakes and then another one with only one cake, I wouldn't know which one was first (or if they are the same cakes at all). – Luis Sep 03 '16 at 21:14
  • @Luis We feed cake pictures to agents in chronological order. Otherwise our cake-bots will never learn to bake cakes. In what sense is time not implicit in any given interactive agent? – Doxosophoi Sep 03 '16 at 21:50
  • @Doxosophoi Yes, but it is not *emergent*. I am asking about robots or AI-things which kind of... *Come to the idea* by themselves, with no previous human intervention ;) – Luis Sep 03 '16 at 21:58
  • @Luis Humans use words a lot to express their ideas and opinions. But, just because an agent may not have the word to articulate "time," that does not mean that the agent lacks opinions about the future, with respect to the present, which implies _some_ understanding of time. But yeah, if you want the _word_ "time" to emerge, then you'd need a system in which salient contexts are emergently given names, like a culture of talking agents. – Doxosophoi Sep 03 '16 at 22:10

1 Answer

3

To my knowledge, this is very much an open research issue.

Here is a paper by Prof Leslie Smith, an acknowledged expert on neuromorphic perceptual coding, which explains the importance of the notion of perceptual time for Artificial General Intelligence and sketches an architecture from which a notion of 'now' might emerge: Perceptual Time.
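This is not Smith's architecture, but as a toy illustration of the general idea, a *before/after* relation can fall out of internal state rather than an explicit clock. In the sketch below (all names hypothetical), each observed event leaves a memory trace that decays over time; the agent can then order two past events simply by comparing trace strengths, with the weaker trace belonging to the older event:

```python
class TraceMemory:
    """Toy agent memory: each event leaves a trace that decays each tick.

    Temporal order is never stored explicitly; it is recovered by
    comparing how far each trace has decayed.
    """

    def __init__(self, decay=0.9):
        self.decay = decay
        self.traces = {}  # event label -> current trace strength

    def step(self, observation=None):
        # All existing traces fade by one tick.
        for label in self.traces:
            self.traces[label] *= self.decay
        if observation is not None:
            # A fresh event starts at full strength.
            self.traces[observation] = 1.0

    def before(self, a, b):
        # 'a happened before b' iff a's trace has decayed further.
        return self.traces[a] < self.traces[b]


mem = TraceMemory()
mem.step("two_cakes")   # first observation
mem.step()              # a tick with nothing new
mem.step("one_cake")    # later observation

print(mem.before("two_cakes", "one_cake"))  # True
```

This is exactly the distinction raised in the comments: a memoryless agent shown the two cake photos cannot say which came first, while an agent with any decaying internal state gets an ordering "for free", without ever being given the concept of time.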

NietzscheanAI