[MWForum] Neural Nets

Daniel Ajoy mwforum@lists.mathcats.com
Wed, 22 May 2002 13:22:59 -0500


On 22 May 2002 at 9:30, Gary McCallister wrote:

> Is anyone on the forum familiar with neural nets? Has anyone tried to
> program one in MW? I am starting to learn a little about neural nets
> and was thinking that it would be interesting and fun to program a
> single artificial neuron in MW. Before jumping in with both feet I
> thought I would see if I was re-inventing the wheel, or if anyone
> might have any suggestions for an approach.

I saved this very, very old post about a "perceptron". I never attempted
to understand it.

Daniel



                       Logo Archive: LOGO-L> A Perceptron
--------------------------------------------------------------------------------

Global SchoolNet - Automated Message Archive

LOGO-L> A Perceptron

--------------------------------------------------------------------------------
Ed Wall (ewall@wis.com)
Sun, 13 Aug 1995 18:50:35 -0900

Okay folks

Here is a perceptron. It runs under UCBLogo on a Mac, and I will transfer
it over to LogoPlus on an Apple II (at least I think I can). Hopefully UCBLogo
is not too esoteric (I notice that, except for arrays, the words I used are
common to LogoPlus). Comments, complaints, and IMPROVEMENTS are welcome. Please
note that it takes a while to generate the neuralnet array, so don't think your
machine has crashed.

It is relatively inefficient and requires a lot of memory. I also have a
three-layer perceptron still to be finished; ideally it would work somewhat
like the one above. I think my students might enjoy drawing objects and
seeing whether they can be categorized. Hmm, that raises some interesting
possibilities.

Ed Wall

P.S. The only choice is Automatic training at present. The necessary extension
is obvious and left to the user :-)
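
In case it's not obvious from the listing, STARTUP drives the whole thing.
Once the listing below is saved to a file, something like this should get
it going under UCBLogo (the name perceptron.lg is just a placeholder):

load "perceptron.lg    ; or whatever you saved the listing as
startup                ; prints the instructions, then trains, then classifies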

-------- cut here
;
; Perceptrons & Neural Nets
; January 1987 AI EXPERT
; by Peter Reece

; modified for Logo by Ed Wall, August 1995

; PERCEPTRON VISION SYSTEM SIMULATION, Peter Reece 1986
; IMAGE() = the sensory grid array
; NEURALNET = the associative net - neural interconnections
; SIZE**2 = number of cells in the sensory grid
; SCAN = number of cells required to construct an 8-bit address
; into the array NEURALNET()
; LOOPSCAN = the number of iterations for scanning the sensory grid -
;            i.e. we look at SCAN cells at random, LOOPSCAN times

Make "image mdarray [16 16]
Make "loopscan 32
;Make "neuralnet mdarray [8192 2]
Make "scan 8
Make "size 16

to askit :x
type :x spaces 1
make "q$ readword
test (empty? :q$)
iffalse [make "q$ first :q$]
print []
end

to message
cleartext
spaces 5
PRINT [This program demonstrates how a very simple]
PRINT [perceptron is capable of analysing visual information.]
PRINT [] PRINT [] PRINT []
PRINT [Proceed as follows: ]
PRINT [] PRINT []
PRINT [1) Draw an object and decide if that object is a member of]
spaces 3
PRINT [object class one or two. E.g. a cup, saucer, and ]
spaces 3
PRINT [plate might be class 1, a crayon class 2.]
PRINT [2) Train the perceptron to recognize objects]
spaces 3
PRINT [of a particular class by drawing various objects]
spaces 3
PRINT [from both classes.]
PRINT [3) Present various objects to the perceptron, (some]
spaces 3
PRINT [old objects may be used, as well as those that it]
spaces 3
PRINT [has never seen before), and see how successfully it]
spaces 3
PRINT [classifies new objects as belonging to the correct class.]
PRINT [] PRINT []
make "q$ [Press [enter] to begin a training session.]
askit :q$
end

to spaces :x
repeat :x [type char 32]
end

to startup
message
cleartext
training
classification
end

to classification
; **************** Classification Session ******************
cleartext
print [Hopefully, this will be a temporary kludge. Ideally, the user]
print [should draw their own object and ask the perceptron to classify it]
print []
print [Sigh!! But at this point all that happens is that the computer ]
print [will generate a number of vertical and horizontal lines at random]
print [and classify them]
print []
spaces 20
print [======== Classification Session =========]
pr []
(rerandom 5)
type [How many objects to test?] spaces 1
make "n# readword
testnet 1
end

to getnet :x
local "idx
local "i
local "j
if :x > :loopscan [stop]
make "index :size*:size*(:x - 1) + 1
getindex 1 1
make "idx list :index 1
make "i mditem :idx :neuralnet
if (empty? :i) [make "i 0]
make "idx list :index 2
make "j mditem :idx :neuralnet
if (empty? :j) [make "j 0]
if (:i = 1) [make "member (:member + 1)]
if (:j = 1) [make "nonmember (:nonmember + 1)]
make "i (:i) + (:j)
; neither is one means the null class
;(if :i = 0 [getnet :x] [getnet (:x + 1)])
getnet (:x + 1)
end

to testnet :x
local "i
if (:x > :n#) [stop]
make "member 0
make "nonmember 0
drawgrid :x
make "i random 2
(if (:i = 0) [randomh] [randomv])
getnet 1
(if (:nonmember=:member) [equalmessage] [notequalmessage])
testnet (:x + 1)
end

to notequalmessage
type [Test] spaces 1 type :x type [:] spaces 2
type [Ratio is] spaces 1 type :member type [/] type :nonmember spaces 2
type [favoring class] spaces 1
(if (:nonmember>:member) [pr [Two]] [pr [One]])
end

to equalmessage
type [Test] spaces 1 type :x type [:] spaces 2
type [Ratio is] spaces 1 type :member type [/] type :nonmember spaces 2
pr [Can't decide!]
end

to zeroimage :i :j
local "idx
if :i=(:size + 1) [stop]
if :j=(:size + 1) [zeroimage (:i + 1) 1]
if :j=(:size + 1) [stop]
make "idx list :i :j
mdsetitem :idx :image 0
zeroimage :i (:j + 1)
end

to setimagev :x :y :z
local "idx
if :y > :z [stop]
make "idx list :x :y
mdsetitem :idx :image 1
setimagev :x (:y + 1) :z
end

to setimageh :x :y :z
local "idx
if :x > :z [stop]
make "idx list :x :y
mdsetitem :idx :image 1
setimageh (:x + 1) :y :z
end

to getindex :x :fact
; Calculate a SCAN-digit address into NEURALNET()
; by scanning any 8 cells of IMAGE() at random.
; If a cell has an active pixel, it is considered on,
; else it is considered off. Hence a SCAN-digit binary address.
local "firstb
local "secondb
local "idx
local "incr
if :x > :scan [stop]
make "firstb random :size
make "firstb (:firstb + 1)
make "secondb random :size
make "secondb (:secondb + 1)
make "idx list :firstb :secondb
make "incr mditem :idx :image
make "index (:index + :incr*:fact)
getindex (:x + 1) 2*:fact
end
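
; Worked example (arbitrary pixel values): if the 8 sampled IMAGE cells
; read 1 0 1 1 0 0 0 1, the successive FACT values are 1 2 4 ... 128, so
; INDEX grows by 1*1 + 0*2 + 1*4 + 1*8 + 0*16 + 0*32 + 0*64 + 1*128 = 141,
; i.e. that pattern selects row SIZE*SIZE*(x-1) + 1 + 141 of NEURALNET.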

to setnet :x
local "idx
if :x > :loopscan [stop]
make "index :size*:size*(:x - 1) + 1
getindex 1 1
make "idx list :index :class
mdsetitem :idx :neuralnet 1
setnet (:x + 1)
end

to drawgrid :x
clearscreen pu ht
setxy 0 100 rt 90
pd label :x pu fd 8 pd label [of] pu fd 14 pd label :n# pu
home setxy 0 0
pd rt 90 fd 16*5 lt 90 fd 16*5 pu
setxy 0 (-10)
end

to autoone :x
if (:x > :n#) [stop]
drawgrid :x
pd label [Object Class One] pu
randomh
setnet 1
autoone (:x + 1)
end

to randomh
local "klen
local "vpos
local "hpos
; Create one horizontal line of length k
zeroimage 1 1
; line length varies from 2 to size-1
make "klen random (:size - 2)
make "klen (:klen + 2)
; vertical start varies from 1 to size
make "vpos random :size
make "vpos (:vpos + 1)
; 0<=horizontal start+k<=size-1
make "hpos random (:size - :klen)
home setxy :hpos*5 :vpos*5
pd rt 90 fd :klen*5 pu
setimageh (:hpos + 1) :vpos (:klen + :hpos + 1)
; Now place this image into neuralnet
setxy 0 (-20)
pd label [Scanning object] pu
end

to autotwo :x
if (:x > :n#) [stop]
drawgrid :x
pd label [Object Class Two] pu
randomv
setnet 1
autotwo (:x + 1)
end

to randomv
local "klen
local "vpos
local "hpos
; Create one vertical line of length k
zeroimage 1 1
; line length varies from 2 to size-1
make "klen random (:size - 2)
make "klen (:klen + 2)
; horizontal start varies from 0 to size-1
make "hpos random :size
; 1<=vertical start+k<=size
make "vpos random (:size - :klen)
make "vpos (:vpos + 1)
home setxy :hpos*5 :vpos*5
pd fd :klen*5 pu
setimagev (:hpos + 1) :vpos (:klen + :vpos)
; Now place this image into neuralnet
setxy 0 (-20)
pd label [Scanning object] pu
end

to training
; **************** Training Session ********************
cleartext
spaces 20
print [ ====== Training Session =========]
print [] print []
make "q$ [Automatic Training (Y/N)?]
askit :q$
make "q$ uppercase :q$
(if (:q$="Y) [autotrain] [manualtrain])
end

to manualtrain
end
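
; One possible MANUALTRAIN (a rough sketch, no input validation - type 1
; or 2 at the prompt): the program still draws random line objects, but
; the user says which class each one belongs to before it is scanned into
; the net. Rename it, or call it from MANUALTRAIN above, to wire it in.
to manualtrain2
cleartext
(rerandom 5)
type [How many objects to train on?] spaces 1
make "n# readword
manualnext 1
end

to manualnext :x
local "i
if (:x > :n#) [stop]
drawgrid :x
make "i random 2
(if (:i = 0) [randomh] [randomv])
make "q$ [Which class is this object (1 or 2)?]
askit :q$
make "class :q$
setnet 1
manualnext (:x + 1)
end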

to autotrain
; '********** Reach here to begin a training session. *********
cleartext
;' Train the neural net on vertical vs. horizontal lines
print [Note: It takes a while to scan each object, but more ]
print [objects mean more accurate classification.]
print []
make "class 1
(rerandom 5)
type [How many objects of Class One?] spaces 1
make "n# readword
autoone 1
print [] print [] print []
make "class 2
(rerandom 5)
type [How many objects of Class Two?] spaces 1
make "n# readword
autotwo 1
end

---------------------------------------------------------------
Please post messages to the Logo forum to logo-l@gsn.org. Mail
questions about the list administration to logofdn@gsn.org. To
unsubscribe send unsubscribe logo-l to majordomo@gsn.org.
