Journal of the Washington Academy of Sciences
Volume 81, Number 1, March 1991
ISSN 0043-0439. Issued quarterly at Washington, D.C., by the Washington Academy of Sciences (founded in 1898), 1101 N. Highland Street, Arlington, VA 22201.

CONTENTS

Articles:

MAXWELL H. MILLER, JEFFREY L. HARPSTER, & JAMES H. HOWARD, JR., "An Artificial Neural-Network Simulation of Auditory Intensity Perception and Profile Analysis"

WILLIAM B. TAYLOR & FREDERICK J. EDESKUTY, "Evaluation of St. Lucia's Geothermal Resource"

CORRIGENDUM

JOHN J. O'HARE, "Perceptual Integration"
Journal of the Washington Academy of Sciences, Volume 81, Number 1, Pages 1-21, March 1991

An Artificial Neural-Network Simulation of Auditory Intensity Perception and Profile Analysis

Maxwell H. Miller, Jeffrey L. Harpster, & James H. Howard, Jr.
Department of Psychology, The Catholic University of America

ABSTRACT

This paper describes a computer simulation of human auditory intensity discrimination. There are currently two different views of how intensity discrimination is carried out by human listeners. The traditional view holds that a listener successively compares the acoustic energy of two sounds and selects the louder of the two. A more recent view, called profile analysis, suggests that a listener simultaneously compares the spectral profile within each sound individually; in other words, the timbre, or perceived spectral shape, of the sound is also considered. The computer simulations replicated a study by Green, Kidd, & Picardi (1983) in which the interval between the sounds was varied. Results from the simulations are consistent with results obtained from human data.

Introduction

In the past five years there has been a growing interest in a class of adaptive computer models known as artificial neural networks. The study of neural networks is a multidisciplinary field involving researchers from diverse backgrounds, including computer science, neuroscience, cognitive science, and psychology. Each of these disciplines applies neural networks in its own way. For example, computer scientists are interested in applying neural networks as a technology to solve difficult engineering problems encountered in robotics, computer-assisted pattern recognition, and artificial intelligence. Neuroscientists are principally interested in building computational models of neurophysiological systems, and artificial neural networks provide them with a useful modeling tool.

Cognitive science and psychology have embraced this technology as a theoretical framework to help explain, understand, and predict human performance. Symbolic models such as expert systems have enjoyed much success when applied to rule-based decision-making problems. However, these same symbolic models have failed abysmally when confronted with perceptual pattern-recognition and classification problems, the very problems at which artificial neural networks excel. The neural network paradigm offers a powerful alternative to the traditional symbolic models so widely applied in the artificial intelligence and cognitive psychology disciplines.

A defining characteristic of artificial neural networks is their parallel architecture.
Unlike conventional digital computers, which have a single central processing unit executing instructions sequentially, artificial neural network models have many simple processors acting together concurrently. Each processing element is capable of computing only a few very simple logical or algebraic operations. The real strength of neural network models comes from the interaction of large assemblies of processing elements acting together in parallel.

Processing elements are arranged in layers. The connections between processing elements are assigned numerical values which are referred to as weights. The architecture of a neural network is tailored to each individual application or problem. The simplest neural network architecture consists of two layers of processing units; more complex architectures support three or more layers. Because computers with true parallel architectures, sometimes referred to as neurocomputers, are not yet widely available, neural network simulations of parallel architectures are carried out on digital computers with conventional von Neumann architectures.

Another important distinction between the processing in neural networks and conventional computing is the way that information is represented. In neural networks, knowledge is not stored locally or associated with an address in memory. Rather, concepts are represented implicitly by a pattern of activation over a large number of processing units. Information, or knowledge, is encoded in the connections themselves, not in the processing elements, which serve only as computing devices. This results in a system which is very fault tolerant and degrades gracefully if a subset of processing units fails or if the input data become corrupted or degraded by noise.

Perhaps the most remarkable aspect of neural networks is their ability to learn by example, without being programmed by a human. Through the application of an iterative training algorithm, a neural network can learn to associate one set of patterns with another. In essence, the network computes a function which maps a set of input patterns onto an output pattern, and the connection weights of the network implement this mapping. The proper set of weights to compute this function is usually not known a priori, so various statistical training algorithms have been developed to adjust the weights.

Perceptual psychologists have become interested in experimenting with neural networks for several reasons. First, because the parallel architectures of neural networks lend themselves to perceptual pattern-recognition problems, they provide psychologists with a superior computational model of the underlying perceptual processes. Second, not only do parallel models provide a better theory of the processes, but their architecture is also based on gross biological principles understood about the brain. And third, neural networks can learn by being trained, can adapt to changing input parameters, and can generalize to novel patterns never seen before. This capacity is one of the cornerstones of human intelligence and performance, and it has been neglected in the past by the statistical models so pervasive in the psychophysical literature. In sum, neural networks are able to demonstrate the type of flexible and adaptive performance which conventional symbolic models lack, while also accounting for perceptual learning (Clark, 1989).
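To make the idea of distributed storage concrete, consider the following minimal sketch. It is our illustration, not code from the paper: the two pattern pairs and the 25% connection-knockout rate are arbitrary choices. A simple linear associator stores both associations in one shared weight matrix, and deleting a quarter of the connections degrades recall rather than erasing any single stored item.

```python
# A toy demonstration that "knowledge" can live in connection weights
# rather than at any single memory address (illustrative, not from the paper).
import numpy as np

rng = np.random.default_rng(0)

# Two orthogonal input patterns and the output patterns they should evoke.
inputs = np.array([[1.0, 0.0, 1.0, 0.0],
                   [0.0, 1.0, 0.0, 1.0]])
targets = np.array([[1.0, 0.0],
                    [0.0, 1.0]])

# Hebbian outer-product storage: both associations share one weight matrix.
W = sum(np.outer(x, t) for x, t in zip(inputs, targets))

print(inputs @ W)        # recalls [2, 0] and [0, 2]: the targets, scaled

# Knock out 25% of the connections at random.
damaged = W * (rng.random(W.shape) > 0.25)
print(inputs @ damaged)  # recall degrades gracefully; it does not vanish
```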
In the present research program an artificial neural network was applied to a classic problem studied by perceptual psychologists for over one hundred years: How can we explain the ability of a listener to discriminate a difference in intensity between two simple sounds? In a typical intensity-discrimination experiment, two sounds are presented successively, separated by a brief interstimulus interval. Both sounds contain a pure tone (known as the standard), and one of them contains the standard with a small added increment (known as the signal). The observer is forced to decide which sound pattern contains the standard plus signal. The traditional explanation given to account for this ability states that a listener performs the task by choosing the tone with the greater acoustic energy.

In contrast to the traditional view is a more current view of intensity discrimination and signal detection which David Green and his colleagues have referred to as auditory profile analysis (Green, 1988). The stimuli used in profile-analysis experiments are complex broadband sounds, in contrast to the pure tones typically used in psychoacoustic experiments. In a profile-analysis task, the sound with the signal component may have less overall energy than the standard; hence, intensity discrimination alone will not work. The judgment of whether a signal is present or absent must be made by considering the internal shape of the spectrum, as opposed to comparing the differences in energy between sounds. Despite the fact that this phenomenon has been studied so extensively, there is still no comprehensive theory of intensity perception and spectral-shape discrimination which can account for the entire body of empirical data. This provided the motivation for performing the computer simulations in the present study.

In this study an artificial neural network was trained on both an intensity-discrimination and a profile-analysis task. The independent variable of interest was the duration of the interstimulus interval (ISI) between the presentation of each sound pattern in a two-alternative forced-choice task. The computer simulations carried out here replicated a study originally done by Green, Kidd, & Picardi (1983). Four separate computer simulations were performed, each of which corresponded to one of the four experimental conditions in the original Green et al. study.

Artificial Neural Networks

The early roots of artificial neural network models were as simple pattern associators. The first neural networks widely discussed in the scientific literature were known as "Perceptrons," a name coined by Frank Rosenblatt in the late 1950s. Perceptron models were often applied to pattern-recognition and classification problems. Through training with an iterative learning algorithm, perceptrons could learn to associate a set of input patterns with an output pattern which coded category membership.

Initially, perceptrons held much promise for modeling perceptual pattern-recognition processes. Rosenblatt proved an important mathematical theorem, known as the perceptron convergence theorem (Rosenblatt, 1962). This theorem guaranteed that if a set of input patterns is learnable by a perceptron, then the learning procedure will converge on a set of connection weights which enable the perceptron to adequately represent the problem. This proof was an important contribution to both the machine-learning field in engineering and learning theory in psychology.
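As an illustration of the kind of iterative training the convergence theorem covers, here is a minimal sketch of the classic perceptron learning rule (our code; the logical-OR task, learning rate, and epoch count are arbitrary choices). On a linearly separable problem such as OR it converges to a correct weight set; on XOR, discussed next, no single-layer weight set exists, so it never converges.

```python
# A minimal perceptron trained with Rosenblatt's error-correction rule
# (illustrative sketch; task and constants are our own choices).
import numpy as np

def train_perceptron(patterns, targets, lr=0.1, epochs=100):
    w = np.zeros(patterns.shape[1])   # the single layer of adaptive weights
    b = 0.0                           # bias (threshold) term
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            y = 1 if x @ w + b > 0 else 0   # hard-threshold output unit
            w += lr * (t - y) * x           # weights change only on errors
            b += lr * (t - y)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
w, b = train_perceptron(X, np.array([0, 1, 1, 1]))    # OR: linearly separable
print([1 if x @ w + b > 0 else 0 for x in X])         # -> [0, 1, 1, 1]
```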
However, as researchers began to experiment with perceptrons, it soon became apparent that there was a class of problems that a simple linear perceptron could not solve. A severe limitation placed on the perceptron by the learning algorithm was that it could adjust only one layer of adaptive weights. As a result, the perceptron could only solve problems which were linearly separable, meaning that it could only perform a linear mapping between the set of input patterns and the output or "target" pattern. This limitation was formally stated in a book called "Perceptrons" written by Marvin Minsky and Seymour Papert (1969). The classic example given in "Perceptrons" was the exclusive-or (XOR) logic operation, which a perceptron cannot compute. As a result of Minsky and Papert's critical evaluation of the perceptron model, coupled with its inherent limitations in the types of problems which could be represented, research with neural networks died out in the late 1960s.

In the past five years there has been a resurgence of interest in adaptive neural network models. This is because a powerful new learning algorithm has been discovered which allows neural networks composed of multiple layers of adaptive weights to be trained (Rumelhart, Hinton, & McClelland, 1986). Multi-layer networks with three or more layers incorporate a middle layer of "hidden" units between the input and output layers. By adding a hidden layer, the neural network can recode the input patterns into a higher-order internal representation. Multi-layer networks have been able to solve many of the problems that a simple linear perceptron was unable to solve, such as the XOR problem.

Network architecture. The architecture of the neural networks used in the present simulations is illustrated in Figure 1. A two-layer, fully interconnected, feed-forward artificial neural network was used. There are two layers of processing elements, an input layer and an output layer. All input units are fully connected to the output unit. There is no feedback from the output unit back to the inputs; activation can only flow forward. Only two layers of processing elements were used in the simulations discussed in this study, as a multi-layer network with hidden units was not needed to perform the task. More complex architectures often support recurrent connections, where activation is passed backwards from the output layer to the input layer, or laterally to other units within a layer. Connections between non-adjacent layers are also possible.

The connection weights themselves are stored in a weight matrix. There is an additional weight associated with each output unit called a bias. The bias value may be thought of as a threshold term that influences the amount of input needed to elicit a response from that particular unit. The bias values are stored in the weight matrix as well.

Network output function. The transfer function for each neuron is known as an activation function. The activation function defines an input-output relationship for a processing element by establishing an output value for a given input value. The output of any unit j, O_j, is a non-linear function of the weighted sum of its inputs plus a threshold or bias value:

O_j = S(Σ_i W_ij O_i + B_j)

where W_ij is the strength of the connection between units i and j, and B_j is the bias value or threshold for unit j. S(x) is a nonlinear squashing function which remaps the sum of inputs into the range 0.0 to 1.0. In the present research a sigmoidal squashing function was used: S(x) = 1/(1 + exp(−x)). This function "squashes" positive values into the range 0.5-1.0 and negative values into the range 0.0-0.5. As a result of the squashing function, the output elicited from a processing unit will not be at its maximum unless it receives a net positive input greater than its bias value.
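The output computation just described is compact enough to sketch directly. The following is our illustration of it (the input activations, weights, and bias below are made-up numbers, not values from the simulations):

```python
# A single processing element: weighted sum of inputs plus bias, passed
# through the sigmoidal squashing function S(x) = 1 / (1 + exp(-x)).
import numpy as np

def squash(x):
    """Sigmoid: remaps any real net input into the range 0.0 to 1.0."""
    return 1.0 / (1.0 + np.exp(-x))

def unit_output(inputs, weights, bias):
    """O_j = S(sum_i W_ij * O_i + B_j) for one output unit."""
    return squash(inputs @ weights + bias)

# Example for a 2-1 architecture: two input units feed one output unit.
x = np.array([0.8, 0.3])            # activations of the two input units
w = np.array([1.5, -2.0])           # illustrative connection weights
print(unit_output(x, w, bias=0.5))  # net = 1.1, so output is about 0.75
```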
[Figure 1. Schematics of the 2-1 and 22-1 feed-forward neural-network architectures trained with the back-propagation learning algorithm. (a) A 2-1 architecture (two input nodes, one output node) used to detect the presence of a signal increment to a sinusoid. (b) A 22-1 architecture (22 input nodes, one output node) used to detect the presence of a signal in a complex sound. In both, the single output node codes the discrimination response: 0.01 = first sound, 0.99 = second sound.]

The learning algorithm. The learning algorithm used to adjust the connection weights of the network during training is the popular back-propagation algorithm. Back-propagation is a training technique which allows multi-layer networks to establish an optimal mapping between input and output units (Rumelhart, Hinton, & McClelland, 1986).

Training involves two steps. At the outset, all of the connection weights are initialized to small random values. In the first step, input patterns are applied to the network. The input vector is fed forward through the network and an output value is computed. This output value (or observed value) is then compared with a target value, which is the desired output value. If there is a discrepancy between the observed value and the target value, then an error signal is generated. The second step in back-propagation involves a backward pass through the network. The error signal is propagated backwards through the network, and each of the weights between the output unit and the input units is adjusted by an amount proportional to the error term. A similar adjustment is made to the bias term.

During the testing phase, the connection weights are fixed and cannot be modified. Testing patterns are applied as input, and the output or response of the network is measured. In the case of the simulations discussed in this study, the network was only required to perform a two-category classification. The task required of the network was to select the interval which contained the signal increment.
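For the two-layer 2-1 network used here, the backward pass has no hidden layer to propagate through, so it reduces to a single delta-rule update per weight. The sketch below is our reconstruction of the training loop under that reduction: the 0.01/0.99 response coding comes from Figure 1, but the learning rate, number of trials, and the toy intensity values are assumptions of ours, not the paper's parameters.

```python
# Training a 2-1 network with the back-propagation (delta-rule) update:
# forward pass, error signal, then weight and bias adjustment.
import numpy as np

rng = np.random.default_rng(1)
w = rng.uniform(-0.1, 0.1, size=2)   # small random initial weights
b = rng.uniform(-0.1, 0.1)           # bias, trained like any other weight
lr = 0.5                             # assumed learning rate

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(20000):
    # One trial: the signal increment falls in the first or second sound.
    signal_first = rng.random() < 0.5
    standard, signal = 0.5, 0.6      # toy intensity levels (assumed)
    x = np.array([signal, standard] if signal_first else [standard, signal])
    target = 0.01 if signal_first else 0.99

    o = sigmoid(x @ w + b)               # step 1: forward pass
    delta = (target - o) * o * (1 - o)   # error signal (sigmoid derivative)
    w += lr * delta * x                  # step 2: adjust the weights...
    b += lr * delta                      # ...and the bias term

# Testing phase: weights frozen, responses simply measured.
print(sigmoid(np.array([0.6, 0.5]) @ w + b))   # low, toward the 0.01 target
print(sigmoid(np.array([0.5, 0.6]) @ w + b))   # high, toward the 0.99 target
```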
Stimulus Conditions

Two methods were used to select the level of the background, or masker, components of the stimulus patterns. In each case, the levels of the maskers were sampled randomly from a uniform distribution. The critical distinction is whether the level of the maskers remains fixed across the two intervals of a trial or is varied randomly within a trial.

The first method used a between-trial variation: the amplitudes of the masker components in the first and second sound intervals are equivalent. The second method used a within-trial variation: the levels of the maskers in a stimulus pair are chosen independently of one another, so the levels of the maskers for the first and second patterns usually differ.

Three different sets of stimulus patterns were generated for the between-trial variation method. They are referred to as the single-sinusoid condition, the uniform-spectrum condition, and the multi-component condition. In the single-sinusoid condition (Figure 2a), there is only one frequency component, and the signal increment was always added to it. In the uniform-spectrum condition (Figure 2b), there are multiple frequency components. The signal in this instance

[Figure 2. Spectral magnitude as a function of frequency for the stimulus conditions: the single-sinusoid condition, the uniform-spectrum condition, and the profile-analysis condition. M marks the masker spectrum; M+S marks the masker plus the signal increment.]

Whether a weight is positive or negative is not important for assessing the contribution of an individual unit to the overall decision arrived at by the network; the absolute value of the weight is what counts.
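That point about weight signs is easy to express in code. The following is our sketch, not an analysis from the paper, and the trained weight values are hypothetical: input units are ranked by the magnitude of their connection weight, since the sign only indicates which response a unit pushes toward.

```python
# Ranking input units by |weight| to gauge their influence on the decision.
import numpy as np

trained_w = np.array([0.4, -2.1, 0.1, 1.3])   # hypothetical trained weights

contribution = np.abs(trained_w)              # sign dropped: magnitude only
ranking = np.argsort(contribution)[::-1]      # unit indices, strongest first
print(ranking)                                # -> [1 3 0 2]: unit 1 dominates
```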