Hi
Holy cow!
How it started:
The Implicit Mechanism of Attention
It may be that a mechanism of attention
results unexpectedly and implicitly from the
design of the topmost, transabstractive level.
Attention would work in the following manner.
Nolarbeit Theory Journal: Part Three of Three
by Arthur T. Murray - 14 JUL 1979
https://mind.sourceforge.net/theory3.html#1979apr11
How it's going:
Attention Is All You Need
The dominant sequence transduction models are
based on complex recurrent or convolutional neural
networks in an encoder-decoder configuration. The best
performing models also connect the encoder and
decoder through an attention mechanism.
Ashish Vaswani et al. 12 Jun 2017
https://arxiv.org/abs/1706.03762
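For what it's worth, the mechanism the abstract names fits in a few lines.
Here is a minimal NumPy sketch of the paper's scaled dot-product attention,
Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V; the toy shapes at the end
are my own illustration, not from the paper:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # score each query against each key, scaled by sqrt(d_k)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # softmax over the keys (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output row is an attention-weighted sum of the values
    return weights @ V

# toy example: 2 queries attending over 3 key/value pairs, d_k = 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal(s) for s in ((2, 4), (3, 4), (3, 4)))
print(scaled_dot_product_attention(Q, K, V))  # shape (2, 4)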
Bye
Kaz Kylheku wrote:
Arthur Theodore Murray Obituary
"In Artificial Intelligence circles, "Mentifex"
(Arthur Theodore Murray) was bold and controversial.
He composed a "theory of the mind" and developed
AI based on his knowledge of classical languages.
He wrote "AI4U," "AI for Latin," and other books.
To his death, he wanted this work to be widely
understood and useful to others."
1946 - 2024 https://obituaries.seattletimes.com/obituary/arthur-murray-1089408830