From prefenes@lbs.ac.uk Sun Feb 23 02:58:41 1997
From: "Paul Refenes"
To: connectionists@cs.cmu.edu, ml.ics.uci.edu@lbs.ac.uk, reinforce@cs.uwa.edu.au
CC: PBolland@lbs.ac.uk, Wholt@lbs.ac.uk, jutans@lbs.ac.uk, prefenes@lbs.ac.uk
Subject: Preprints available - Neural Networks in the Capital Markets [connectionists]
Date: Fri, 21 Feb 1997 19:55:04 BST

Neural Networks in the Capital Markets

The following NNCM-96 pre-prints are now available on request. Please send your postal address to: boguntula@lbs.lon.ac.uk

===================================================
NEURAL MODEL IDENTIFICATION, VARIABLE SELECTION AND MODEL ADEQUACY
A-P. N. REFENES, A. D. ZAPRANIS AND J. UTANS
Department of Decision Science, London Business School, Regents Park, London, NW1 4SA, UK

In recent years an impressive array of publications has appeared claiming considerable successes of neural networks in modeling financial data, but skeptical practitioners and statisticians still raise the question of whether neural networks really are "a major breakthrough or just a passing fad". A major reason for this is the lack of procedures for testing misspecified models and of tests of statistical significance for the various parameters that have been estimated, which makes it difficult to assess a model's significance and to rule out the possibility that any reported short-term successes are due to "data mining". In this paper we describe a methodology for neural model identification which facilitates hypothesis testing at two levels: model adequacy and variable significance. The methodology includes a model selection procedure to produce consistent estimators, a variable selection procedure based on variable significance testing, and a model adequacy procedure based on residuals analysis.

================================================
SPECIFICATION TESTS FOR NEURAL NETWORKS: A CASE STUDY IN TACTICAL ASSET ALLOCATION
A. D. ZAPRANIS, J. UTANS, A-P. N. REFENES
Department of Decision Science, London Business School, Regents Park, London, NW1 4SA, UK

A case study in tactical asset allocation is used to introduce a methodology for neural model identification including model specification and variable selection. Neural models are contrasted to multiple linear regression on the basis of model identification. The results indicate the presence of non-linear relationships between the economic variables and asset class returns.
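As a rough illustration of the variable significance testing named in the first abstract, one can fit a small neural model and score each input by how much a perturbation of that input moves the fitted output; inputs whose perturbation barely changes the output are candidates for removal. The sketch below is only an illustration of that general idea under assumptions of my own, not the authors' procedure: the synthetic data, the scikit-learn MLPRegressor, and the sensitivity() helper are all introduced here for demonstration.

    # Hypothetical sketch (not the paper's method): sensitivity-based variable
    # significance for a fitted neural regression model.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Synthetic data: y depends non-linearly on x0 and x1; x2 is pure noise.
    X = rng.normal(size=(500, 3))
    y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=500)

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(X, y)

    def sensitivity(model, X, j, eps=1e-2):
        """Mean absolute change in the fitted output per unit perturbation of input j."""
        X_pert = X.copy()
        X_pert[:, j] += eps
        return np.mean(np.abs(model.predict(X_pert) - model.predict(X))) / eps

    for j in range(X.shape[1]):
        print(f"x{j}: sensitivity {sensitivity(model, X, j):.3f}")  # x2 should score near zero

In an actual application one would complement such a raw score with a formal significance test (for example, against a bootstrap distribution of the score), which is closer to what the abstract proposes.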
=============================================
From harnad@cogsci.soton.ac.uk Sun Feb 23 13:53:37 1997
From: Stevan Harnad
Date: Sun, 23 Feb 97 17:11:28 GMT
To: Math Psych List, Artificial Life List, cogneuro@ptolemy-ethernet.arc.nasa.gov, Cogniscience Francophone, PSYCOLOQUY
Subject: Psycoloquy Call for Papers

                    PSYCOLOQUY CALL FOR PAPERS

PSYCOLOQUY is a refereed electronic journal (ISSN 1055-0143) sponsored on an experimental basis by the American Psychological Association and currently estimated to reach a readership of 50,000. PSYCOLOQUY publishes reports of new ideas and findings on which the author wishes to solicit rapid peer feedback, international and interdisciplinary ("Scholarly Skywriting"), in all areas of psychology and its related fields (biobehavioral science, cognitive science, neuroscience, social science, etc.). All contributions are refereed.

Target article length should normally not exceed 500 lines [c. 4500 words]. Commentaries and responses should not exceed 200 lines [c. 1800 words].

All target articles, commentaries and responses must have (1) a short abstract (up to 100 words for target articles, shorter for commentaries and responses), (2) an indexable title, and (3) the authors' full name(s), institutional address(es) and URL(s). In addition, for target articles only: (4) 6-8 indexable keywords, (5) a separate statement of the authors' rationale for soliciting commentary (e.g., why would commentary be useful and of interest to the field? what kind of commentary do you expect to elicit?) and (6) a list of potential commentators (with their email addresses).

All paragraphs should be numbered in articles, commentaries and responses (see the format of already published articles in the PSYCOLOQUY archive; line length should be < 80 characters, no hyphenation).

It is strongly recommended that all figures be designed so as to be screen-readable ascii. If this is not possible, the provisional solution is the less desirable hybrid one of submitting them as .gif, .jpeg, .tiff or postscript files (or in some other universally available format) to be printed out locally by readers to supplement the screen-readable text of the article.

PSYCOLOQUY also publishes multiple reviews of books in any of the above fields; these should normally be the same length as commentaries, but longer reviews will be considered as well.
Book authors should submit a 500-line self-contained Precis of their book, in the format of a target article; if accepted, this will be published in PSYCOLOQUY together with a formal Call for Reviews (of the book, not the Precis). The author's publisher must agree in advance to furnish review copies to the reviewers selected.

Authors of accepted manuscripts assign to PSYCOLOQUY the right to publish and distribute their text electronically and to archive and make it permanently retrievable electronically, but they retain the copyright, and after it has appeared in PSYCOLOQUY authors may republish their text in any way they wish -- electronic or print -- as long as they clearly acknowledge PSYCOLOQUY as its original locus of publication. However, except in very special cases agreed upon in advance, contributions that have already been published or are being considered for publication elsewhere are not eligible to be considered for publication in PSYCOLOQUY.

Please submit all material to psyc@pucc.princeton.edu

http://www.princeton.edu/~harnad/psyc.html
http://cogsci.soton.ac.uk/psyc
ftp://ftp.princeton.edu/pub/harnad/Psycoloquy
ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy
gopher://gopher.princeton.edu/11/.libraries/.pujournals
news:sci.psychology.journals.psycoloquy

----------------------------------------------------------------------

CRITERIA FOR ACCEPTANCE:

To be eligible for publication, a PSYCOLOQUY target article should not only have sufficient conceptual rigor, empirical grounding, and clarity of style, but should also offer a clear rationale for soliciting Commentary. That rationale should be provided in the author's covering letter, together with a list of suggested commentators. A target article can be (i) the report and discussion of empirical research; (ii) a theoretical article that formally models or systematizes a body of research; or (iii) a novel interpretation, synthesis, or critique of existing experimental or theoretical work. Articles dealing with social or philosophical aspects of the behavioral and brain sciences are also eligible.

The service of Open Peer Commentary will be primarily devoted to original unpublished manuscripts. However, a recently published book whose contents meet the standards outlined above may also be eligible for Commentary. In such a Multiple Book Review, a comprehensive, 500-line precis by the author is published in advance of the commentaries and the author's response. In rare special cases, Commentary will also be extended to a position paper or an already published article dealing with particularly influential or controversial research. Submission of an article implies that it has not been published and is not being considered for publication elsewhere. Multiple book reviews and previously published articles appear by invitation only. The Associateship and professional readership of PSYCOLOQUY are encouraged to nominate current topics and authors for Commentary.

In all the categories described, the decisive consideration for eligibility will be the desirability of Commentary for the submitted material. Controversiality simpliciter is not a sufficient criterion for soliciting Commentary: a paper may be controversial simply because it is wrong or weak. Nor is the mere presence of interdisciplinary aspects sufficient: general cybernetic and "organismic" disquisitions are not appropriate for PSYCOLOQUY.
Some appropriate rationales for seeking Open Peer Commentary would be that: (1) the material bears in a significant way on some current controversial issues in behavioral and brain sciences; (2) its findings substantively contradict some well-established aspects of current research and theory; (3) it criticizes the findings, practices, or principles of an accepted or influential line of work; (4) it unifies a substantial amount of disparate research; (5) it has important cross-disciplinary ramifications; (6) it introduces an innovative methodology or formalism for consideration by proponents of the established forms; (7) it meaningfully integrates a body of brain and behavioral data; (8) it places a hitherto dissociated area of research into an evolutionary or ecological perspective; etc.

In order to assure communication with potential commentators (and readers) from other PSYCOLOQUY specialty areas, all technical terminology must be clearly defined or simplified, and specialized concepts must be fully described.

NOTE TO COMMENTATORS: The purpose of the Open Peer Commentary service is to provide a concentrated constructive interaction between author and commentators on a topic judged to be of broad significance to the biobehavioral science community. Commentators should provide substantive criticism, interpretation, and elaboration as well as any pertinent complementary or supplementary material, such as illustrations; all original data will be refereed in order to assure the archival validity of PSYCOLOQUY commentaries. Commentaries and articles should be free of hyperbole and remarks ad hominem.

STYLE AND FORMAT FOR ARTICLES AND COMMENTARIES

TARGET ARTICLES should not exceed 500 lines (~4500 words); commentaries should not exceed 200 lines (~1800 words), including references. Spelling, capitalization, and punctuation should be consistent within each article and commentary and should follow the style recommended in the latest edition of A Manual of Style, The University of Chicago Press. It may be helpful to examine a recent issue of PSYCOLOQUY. All submissions must include an indexable title, followed by the authors' names in the form preferred for publication, full institutional addresses and electronic mail addresses, a 100-word abstract, and 6-12 keywords. Tables and diagrams should be made screen-readable wherever possible (if unavoidable, printable postscript files may contain the graphics separately). All paragraphs should be numbered consecutively. No line should exceed 72 characters, and a blank line should separate paragraphs.

REFERENCES: Bibliographic citations in the text must include the author's last name and the date of publication and may include page references. Complete bibliographic information for each citation should be included in the list of references. Examples of correct style are: Brown (1973); (Brown 1973); (Brown 1973; 1978); (Brown 1973; Jones 1976); (Brown & Jones 1978); (Brown et al. 1978). References should be typed on a separate sheet in alphabetical order in the style of the following examples. Do not abbreviate journal titles.

Kupfermann, I. & Weiss, K. (1978) The command neuron concept. Behavioral and Brain Sciences 1:3-39.

Dunn, J. (1976) How far do early differences in mother-child relations affect later developments? In: Growing points in ethology, ed. P. P. G. Bateson & R. A. Hinde, Cambridge University Press.

Bateson, P. P. G. & Hinde, R. A., eds. (1978) Growing points in ethology, Cambridge University Press.
EDITING: PSYCOLOQUY reserves the right to edit and proof all articles and commentaries accepted for publication. Authors of articles will be given the opportunity to review the copy-edited draft. Commentators will be asked to review copy-editing only when changes have been substantial.

---------------------------------------------------------------------------
Prof. Stevan Harnad          psyc@pucc.princeton.edu
Editor, Psycoloquy           phone: +44 1703 594-583   fax: +44 1703 593-281
Department of Psychology     http://cogsci.soton.ac.uk/psyc
University of Southampton    http://www.princeton.edu/~harnad/psyc.html
Highfield, Southampton       ftp://ftp.princeton.edu/pub/harnad/Psycoloquy
SO17 1BJ UNITED KINGDOM      ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy
                             news:sci.psychology.journals.psycoloquy
                             gopher://gopher.princeton.edu/11/.libraries/.pujournals
Sponsored by the American Psychological Association (APA)

From pazzani@super-pan.ICS.UCI.EDU Sun Feb 23 19:39:17 1997
From: Michael Pazzani
To: ML-LIST:;
Reply-to: ml@ics.uci.edu
Date: Sun, 23 Feb 1997 14:39:57 -0800
Subject: Machine Learning List: Vol. 9, No. 3

Machine Learning List: Vol. 9, No. 3
Sunday, February 23, 1997

Contents:
  CFP: Special Machine Learning issue on applications of ML
  Student Position: Learning for Agents and Multi-Agent Systems
  CLNL v4 is here!
  ECML'97 - Papers & Registration Info
  Job: machine learning at Boeing
  Research Programmer Job Opportunity
  PA EXPO97 UPDATE
  CFP: AAAI Workshop on Multiagent Learning
  Job Opportunity for Mathematician/Programmer in Kyoto
  CFP: IJCAI-97 Workshop on VV & Refinement of AI Systems & Subsystems
  Does plasticity imply local learning? And other questions
  Research Scientist for autonomous data analysis
  IDAMAP-97: Reminder and brief Second CFP
  3 positions at NYNEX S&T
  IJCAI Workshop: FRONTIERS OF INDUCTIVE LOGIC PROGRAMMING
  Job: machine learning at Boeing
  AIME'97 Programme & Registration
  Learning Methods Tutorial -- Washington DC, May 1997

The Machine Learning List is moderated. Contributions should be relevant to the scientific study of machine learning. Mail contributions to ml@ics.uci.edu. Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues may be obtained from http://www.ics.uci.edu/AI/ML/Machine-Learning.html

----------------------------------------------------------------------

From: Ronny Kohavi
Date: Tue, 4 Feb 1997 23:22:29 -0800
Subject: CFP: Special Machine Learning issue on applications of ML

This is a short reminder that the submission deadline for the special issue of Machine Learning is in a few weeks. For more information, see http://reality.sgi.com/ronnyk/mljapps/

*** Submission deadline: 4 Mar 1997

Machine Learning
Special Issue on Applications of Machine Learning and the Knowledge Discovery Process
Guest editors: Ronny Kohavi and Foster Provost

With the explosion in size of business and scientific databases (VLDBs), the opportunities and pressure to mine the data and make novel discoveries have increased dramatically.
For many problems, basic statistical summaries are not sufficient and there is a clear and recognized need for solutions involving a machine learning component. For example, modern businesses constantly seek to gain competitive advantage by tailoring actions to different customer segments and avoiding the trap of targeting the "average customer." This special issue of the journal Machine Learning will be dedicated to papers describing work in which machine learning technologies have been applied to solve significant real-world problems. In particular, it will focus on the application of Machine Learning technology, the simplifying assumptions that *cannot* be made in a real-world application, and the processes that are involved in going from the raw data to the final knowledge that decision makers seek. Ronny Kohavi and Foster Provost mljapps@postofc.corp.sgi.com ------------------------------ From: Steve Lawrence Date: Wed, 12 Feb 1997 13:04:53 -0500 Subject: Student Position: Learning for Agents and Multi-Agent Systems The NEC Research Institute in Princeton, NJ has an immediate opening for a student research position in the area of learning for agents and multi-agent systems. Candidates must have experience in research and be able to effectively communicate research results. Ideal candidates will have knowledge of one or more machine learning techniques (e.g. neural networks, decision trees, rule based systems, and nearest neighbor techniques), and be proficient in the software implementation of algorithms. NEC Research provides an outstanding research environment with many recognized experts and excellent resources including several multiprocessor machines. Interested applicants should apply by email, mail or fax including their resumes and any specific interests in learning for agents and multi-agent systems to: Dr. C. Lee Giles NEC Research Institute 4 Independence Way Princeton NJ 08540 Phone: (609) 951 2642 Fax: (609) 951 2482 Email: giles@research.nj.nec.com ------------------------------ From: Russell Greiner Date: Mon, 3 Feb 1997 18:56:37 -0500 (EST) Subject: CLNL v4 is here! We are pleased to announce that the book "Computational Learning Theory and Natural Learning Systems Volume IV: Making Learning Systems Practical" (ed. Russell Greiner, Thomas Petsche, and Stephen Jose Hanson) is now available from MIT Press; see http://www-mitpress.mit.edu/mitp/recent-books/comp/greop.html for details. Cheers, Russ Greiner ------------------------------ From: Gerhard Widmer Date: Mon, 10 Feb 1997 18:00:33 +0100 (MET) Subject: ECML'97 - Papers & Registration Info NINTH EUROPEAN CONFERENCE ON MACHINE LEARNING (ECML-97) Prague, Czech Republic, April 23-26 1997 ****************************************************** ECML'97: LIST OF ACCEPTED PAPERS and REGISTRATION INFO ****************************************************** The list of accepted papers, INCLUDING ALL ABSTRACTS, is now available from the ECML-97 WWW home page: http://is.vse.cz/ecml97/home.html This page also gives access to - the 4 post-conference ECML/MLNet WORKSHOPS and - ECML-97 REGISTRATION INFORMATION and the ECML REGISTRATION FORM. - A preliminary version of the CONFERENCE PROGRAMME will be available soon. For further questions about the program, contact Gerhard Widmer at gerhard@ai.univie.ac.at, for questions regarding registration, contact the local organizers at actionm@cuni.cz. 
For those without access to the WWW, please find below
 - titles and contact addresses for the 4 MLNet workshops,
 - the list of papers (w/o abstracts),
 - an ascii version of the registration form.

ECML / MLNet WORKSHOPS (Saturday, April 26):

WS 1: Data-Driven Learning of Natural Language Processing Tasks
Contact: Walter Daelemans, P.O. BOX 90153, NL-5000 LE Tilburg, The Netherlands.
Tel: +31 13 4663070, Fax: +31 13 4663110, E-mail: walter.daelemans@kub.nl
WS1 WWW Page: http://www.cs.unimaas.nl/ecml97/

WS 2: Case-Based Learning: Beyond Classification of Feature Vectors
Contact: Dietrich Wettschereck, GMD, FIT.KI, Schloss Birlinghoven, 53754 Sankt Augustin, Germany
Tel: +49-2241-14-2097, Fax: +49-2241-14-2072, E-mail: dietrich.wettschereck@gmd.de
WS2 WWW Page: http://nathan.gmd.de/persons/dietrich.wettschereck/ecmlws.html

WS 3: Learning in Dynamically Changing Domains: Theory Revision and Context Dependence Issues
Contact: Gholamreza Nakhaeizadeh, Research Center of Daimler-Benz AG, Ulm, Germany
E-mail: nakhaeizadeh@dbag.ulm.DaimlerBenz.COM
WS3 WWW Page: http://www.amsta.leeds.ac.uk/statistics/ecml97/dyn.htm

WS 4: Machine Learning and Human-Agent Interaction
Contact: Michael Kaiser, Institute for Real-Time Computer Systems & Robotics, University of Karlsruhe, Kaiserstrasse 12, D-76128 Karlsruhe, Germany
E-Mail: kaiser@ira.uka.de
WS4 WWW Page: http://wwwipr.ira.uka.de/events/hai97/

Common dates for all workshops:
Deadline for submissions: February 15
Notification of acceptance: March 8
Camera-ready copy due: April 1

PAPERS ACCEPTED FOR PRESENTATION AT ECML'97:

INVITED TALKS / PAPERS:
Learning Complex Probabilistic Models (tentative title)
  Stuart J. Russell, University of California, Berkeley, USA
Constructing and Sharing Perceptual Distinctions
  Luc Steels, Free University of Brussels (VUB) and Sony Computer Science Laboratory, Paris
On Prediction by Data Compression
  Paul Vitanyi, CWI, Amsterdam; Ming Li, City University of Hong Kong

LONG TALKS/PAPERS:
Induction of Feature Terms with INDIE
  Eva Armengol & Enric Plaza, IIIA, Barcelona, Spain
Integrated Learning and Planning Based on Truncating Temporal Differences
  Pawel Cichosz, Warsaw University of Technology, Warsaw, Poland
Theta-subsumption for Structural Matching
  Luc De Raedt, Katholieke Universiteit Leuven, Belgium; Peter Idestam-Almquist, Stockholm University, Sweden; Gunther Sablon, Katholieke Universiteit Leuven, Belgium
Constructing Intermediate Concepts by Decomposition of Real Functions
  Janez Demsar, Blaz Zupan, Marko Bohanec, Ivan Bratko, University of Ljubljana and Jozef Stefan Institute, Ljubljana, Slovenia
Conditions for Occam's Razor Applicability and Noise Elimination
  Dragan Gamberger, Rudjer Boskovic Institute, Zagreb, Croatia; Nada Lavrac, Jozef Stefan Institute, Ljubljana, Slovenia
Learning Different Types of New Attributes by Combining the Neural Network and Iterative Attribute Construction
  Yuh-Jyh Hu, University of California, Irvine, USA
Finite-Element Methods with Local Triangulation Refinement for Continuous Reinforcement Learning Problems
  Remi Munos, CEMAGREF, Antony, France
Compression-based Pruning of Decision Lists
  Bernhard Pfahringer, University of Waikato, New Zealand
NeuroLinear: A System for Extracting Oblique Decision Rules from Neural Networks
  Rudy Setiono & Huan Liu, National University of Singapore
Model Combination in the Multiple-data-batches Scenario
  Kai Ming Ting, University of Waikato, New Zealand; Boon Toh Low, Chinese University of Hong Kong
Natural Ideal Operators in Inductive Logic Programming
  Fabien Torre &
  Celine Rouveirol, LRI, Paris, France
Ibots Learn Genuine Team Solutions
  Cristina Versino & Luca Maria Gambardella, IDSIA, Switzerland
Global Data Analysis and the Fragmentation Problem in Decision Tree Induction
  Ricardo Vilalta, Gunnar Blix, Larry Rendell, University of Illinois at Urbana-Champaign, USA

SHORT TALKS/PAPERS:
Exploiting Qualitative Knowledge to Enhance Skill Acquisition
  Cristina Baroglio, Universita di Torino, Italy
Classification by Voting Feature Intervals
  G"ulsen Demir"oz & H. Altay G"uvenir, Bilkent University, Ankara, Turkey
Metrics on Terms and Clauses
  Alan Hutchinson, King's College, London, UK
Learning When Negative Examples Abound
  Miroslav Kubat, Robert Holte, Stan Matwin, University of Ottawa, Canada
A Model for Generalization based on Confirmatory Induction
  Nicolas Lachiche, INRIA Lorraine, France; Pierre Marquis, Universite d'Artois, France
Learning Linear Constraints in Inductive Logic Programming
  Lionel Martin & Christel Vrain, Universite d'Orleans, France
Inductive Genetic Programming with Decision Trees
  Nikolay Nikolaev, American University in Bulgaria; Vanyo Slavov, New Bulgarian University, Sofia, Bulgaria
Parallel and Distributed Search for Structure in Multivariate Time Series
  Tim Oates, Matthew Schmill, Paul Cohen, University of Massachusetts, Amherst, USA
Probabilistic Incremental Program Evolution: Stochastic Search Through Program Space
  Rafal Salustowicz & J"urgen Schmidhuber, IDSIA, Switzerland
The GRG Knowledge Discovery System: Design Principles and Architectural Overview
  Ning Shan, Macro International Inc., Calverton, MD, USA; Howard Hamilton & Nick Cercone, University of Regina, Canada
Learning and Exploitation do not Conflict under Minimax Optimality
  Csaba Szepesvari, University of Szeged, Hungary
Search-based Class Discretization
  Luis Torgo & Joao Gama, University of Porto, Portugal
A Case Study in Loyalty and Satisfaction Research
  Koen Vanhoof, Josee Bloemer, K. Pauwels, Limburgs Universitair Centrum, Belgium

REGISTRATION FORM - ECML 97 (deadline: March 25, 1997)
TO BE FAXED TO (42-2) 6731 0503 OR MAILED TO:
Action M Agency, Vrsovicka 68, 101 00 - Praha 10, Czech Republic
(Note, please, that after March 1, 1997, the country number (42) will be changed to (420).)

FILL IN CAPITAL LETTERS, PLEASE
last name:                      first name:                  Prof./Dr./Mr./Ms.
affiliation:
university/dept.:
street:
town:                           Code:                        country:
phone:                          fax:                         e-mail:
name of accompanying person(s):
date (time) of arrival:         date of departure:           number of nights:
I will attend workshop:  1.  2.  3.  4.  (tick, please)

ACCOMMODATION: Krystal hotel (conference site) / an individual choice up to price per night:
Room: single / double
NAME OF PERSON SHARING THE ROOM:
special needs (vegetarian, disabled, etc.):

CONFERENCE FEES:                        BEFORE / AFTER FEBRUARY 20, 1997
CONFERENCE FEE (APRIL 23-25)            DM 270.00 / 320.00
MLNet WORKSHOP FEE (APRIL 26)           DM  35.00 /  35.00
ACCOMPANYING PERSON FEE                 DM  80.00 / 100.00
ACCOMMODATION DEPOSIT:                  DM 150.00
ACCOMMODATION BALANCE: (NUMBER OF NIGHTS MINUS THE DEPOSIT)
SOCIAL PROGRAM:
SIGHTSEEING TOUR OF PRAGUE              DM  25.00
TA FANTASTIKA THEATRE                   DM  27.00
TRIP & FAREWELL PARTY                   DM  65.00
TOTAL AMOUNT:

PAYMENT BY CREDIT CARD:  AMEX / VISA / Master Card / Eurocard / JCB / Diners Club
Number:                          Expire:    /
Four-number code (for AMEX cards only):  / / / /
I, the undersigned, give the authorization to the Action M Agency to withdraw from my account the equivalent in Czech Crowns of the total amount of DM
Your Signature
I agree to withdraw from my credit card the accommodation balance (after March 25)
Your Signature

PAYMENT BY BANK TRANSFER:
Name of the bank                 Date of payment                 Your Signature

------------------------------

From: Wei Zhang
Date: Mon, 17 Feb 1997 16:50:20 -0800
Subject: Job: machine learning at Boeing

**Outstanding Machine Learning Researcher needed**

The Boeing Company, the world's largest aerospace company, is actively working on research projects in advanced computing technologies, including projects involving NASA, the FAA, Air Traffic Control, and Global Positioning, as well as airplane and manufacturing research. The Research and Technology organization located in Bellevue, Washington, near Seattle, has an open position for a machine learning researcher. We are the primary computing research organization for Boeing and have contributed heavily both to short-term technology advances and to long-range planning and development.

BACKGROUND REQUIRED: Machine Learning, Knowledge Discovery, Data Mining, Statistics, Artificial Intelligence or related field.

RESEARCH AREAS: We are developing and applying techniques for data mining and statistical analyses of diverse types of data, including: safety incidents, flight data recorders, reliability, maintenance, manufacturing, and quality assurance data. These are not areas where most large R&D data mining efforts are currently focused. Research areas include data models, data mining algorithms, statistics, and visualization. Issues related to our projects also include pattern recognition, multidimensional time series, and temporal databases. We can achieve major practical impacts in the short term both at Boeing and in the airline industry, which may result in a safer and more cost-effective air travel industry.

A Ph.D. in Computer Science or equivalent experience is highly desirable for the position. We strongly encourage diversity in backgrounds, including both academic and industrial experience. Knowledge of machine learning, statistics, and data mining are important factors. Experience with databases and programming (C/C++, Java, and Splus) is desirable.

APPLICATION: If you meet the requirements and you are interested, please send your resume via electronic e-mail in plain ASCII format to zhangw@redwood.rt.cs.boeing.com (Wei Zhang). You can also send it via US mail to:
Wei Zhang, The Boeing Company, PO Box 3707, MS 7L-66, Seattle, WA 98124-2207
Application deadline is April 30, 1997. The Boeing Company is an equal opportunity employer.

------------------------------

From: Sandi Brummert
Date: Thu, 20 Feb 1997 16:21:31 -0500
Subject: Research Programmer Job Opportunity

Research Programmer Job Opportunity
Empirical Media Corporation, Pittsburgh, PA
http://www.empirical.com

Empirical Media is looking for Research Programmers who are interested in building a real product that is changing the future of the Internet. Our WiseWire web service is the application of intelligent agent technology to information access and delivery. WiseWire's advanced algorithms in machine learning and its use of individualized and collaborative filtering combine to deliver Internet content that is highly personalized, relevant, timely, and valuable to the user.
As a Research Programmer working on WiseWire, you will have the opportunity to see the practical application of your research work. We are seeking students or professionals with a background in statistics, information retrieval, collaborative filtering, agents, or machine learning. Candidates must have an advanced degree in a computer-related field and extensive C++ programming experience.

Empirical Media offers you the opportunity to work in a well-financed and supportive business environment in close proximity to prominent universities on the cutting edge of technology (we are just down the block from both Carnegie Mellon University and the University of Pittsburgh). We also offer the chance for you to join a team of bright and dedicated technologists who are diligently working to revolutionize the way people view the Internet. And, our substantial stock option program allows you to directly participate in the value that you would be helping to create.

For confidential consideration for this position, please send, email, or fax your resume or letter of interest to:
Empirical Media Corporation, 5001 Centre Avenue, Pittsburgh, PA 15213
email: jobs@empirical.com   fax: 412-688-8853

------------------------------

From: Steve Cartmell
Date: Wed, 5 Feb 1997 18:20:49 +0000
Subject: PA EXPO97 UPDATE

PRACTICAL APPLICATION EXPO97 -- CONFERENCE UPDATE
Westminster Central Hall, London, 21-25 April, 1997

The Practical Application EXPO97 brings together four events under one roof: PAAM97 - The Practical Application of Intelligent Agents and Multi-Agents; PADD97 - The Practical Application of Knowledge Discovery and Data Mining; PACT97 - The Practical Application of Constraint Technology; and PAP97 - The Practical Application of Prolog.

PLEASE VISIT OUR RECENTLY UPDATED WEB PAGES FOR FURTHER INFORMATION ON: Tutorials, Invited Talks, Exhibition, Venue, Hotel reservations, Registration.
http://www.demon.co.uk/ar/Expo97/
http://www.demon.co.uk/ar/PAP97/
http://www.demon.co.uk/ar/PACT97/
http://www.demon.co.uk/ar/PAAM97/
http://www.demon.co.uk/ar/PADD97/

The Practical Application Company, PO Box 137, Blackpool, Lancs FY2 9UN, UK
Tel: +44 (0)1253 358081   Fax: +44 (0)1253 353811
email: info@pap.com   WWW: http://www.demon.co.uk/ar/TPAC/

------------------------------

From: Sandip Sen
Date: Sat, 15 Feb 1997 11:00:30 -0600
Subject: CFP: AAAI Workshop on Multiagent Learning

AAAI-97 Workshop on Multiagent Learning

Description
___________
This workshop addresses the requirements for agents to learn and adapt in the presence of other agents. Of particular relevance for the workshop are the applicability and limitations of current machine learning techniques for multiagent problems, and new learning and adaptive mechanisms particularly suited to them.

Topics of interest
__________________
Among others, papers discussing the following topics are welcome:
__ Evaluating the effectiveness of individual learning strategies, or multistrategy combinations, in cooperative/competitive scenarios.
__ Characterizing learning methods in terms of the modeling power, communication abilities, knowledge requirements, and processing abilities of agents.
__ Co-evolving multiple agents with similar/opposing interests.
__ Teacher-student relationships among agents.
__ Specific applications demonstrating how multiagent systems benefit from learning.

Workshop format
_______________
The workshop will contain paper sessions on common themes with panels at the end to compare/contrast the research presented.
We plan to host an invited speaker and a panel discussion on key multiagent learning issues.

Workshop attendance
___________________
Participation will be by invitation only (limited to 40 people).

Submission Requirements
_______________________
E-mail a postscript copy of one of the following to sandip@kolkata.mcs.utulsa.edu:
__ a brief statement of interest (1 page), or
__ a complete paper (6 pages maximum) including keywords and the authors' complete addresses.

Direct correspondence to:
Sandip Sen (Workshop Chair)
Department of Mathematical & Computer Sciences, University of Tulsa, 600 South College Avenue, Tulsa, OK 74104-3189.
OFFICE: 918-631-2985   FAX: 918-631-3077   e-mail: sandip@kolkata.mcs.utulsa.edu

Important Dates
_______________
Deadline for paper submission: March 11, 1997
Acceptance notice to participants: April 1, 1997
Camera-ready papers due: April 22, 1997

Workshop Committee
____________________
Gordon, Diana        Naval Research Laboratory          e-mail: gordon@aic.nrl.navy.mil
Hogg, Tad            Xerox PARC                         e-mail: hogg@parc.xerox.com
Lesser, Victor       University of Massachusetts        e-mail: LESSER@cs.umass.edu
Sen, Sandip (Chair)  University of Tulsa                e-mail: sandip@kolkata.mcs.utulsa.edu
Veloso, Manuela      CMU                                e-mail: veloso@cs.cmu.edu
Weiss, Gerhard       Technische Universitaet Muenchen   e-mail: weissg@informatik.tu-muenchen.de

------------------------------

From: Craig Macdonald
Date: Mon, 17 Feb 1997 11:13:30 +0900
Subject: Job Opportunity for Mathematician/Programmer in Kyoto

Kyoto, Japan Anyone?

This is a job at ATR in Kyoto, Japan that starts this Spring (preferably in May), or as soon thereafter as possible, and lasts for a year to a year and a half. The slot is for someone with heavy C++ experience and skills, who is also a seasoned applied mathematician with expertise in prediction, and who has spent time in industry on large-scale, real-world prediction problems.

For the past two years, this job has been occupied by Stephen Eubank, of SFI, The Prediction Company, and Los Alamos. He is now ready to move on, but has enjoyed his work and his life here, and anyone considering this job may wish to contact him at eubank@itl.atr.co.jp for his views and experiences here. (You might also contact Tom Ray, ray@hip.atr.co.jp, who has been at ATR for the past three years, to ask about life at this lab, and in Japan, more generally.) Previous to that, the job belonged to Turing International Programming Prize-winner David Magerman, formerly of IBM Watson Research Center, then BBN, and now an equity-"quant"-fund technical manager with Renaissance, Inc. We also have an ongoing consultancy relationship with John Lafferty, a research professor at CMU, prior to that at IBM Watson Research, and before that on the Harvard math faculty. The person who takes on this job would be interacting with John, as well as with other interesting people in the field of prediction.

Basically this job has two aspects: (1) maintaining and troubleshooting a large C++ system we have created, which probabilistically assigns grammatical structure to English sentences (i.e. "diagrams" or "parses" sentences); our current models use CART-type probabilistic decision trees; and (2) experimenting with variations on our current models, so as to improve performance. The purpose of the parsing program is to help in tasks such as speech-to-speech Japanese-to-English translation (we are working on a Japanese version of the software as well); speech and handwriting recognition; and information retrieval.
Projects of all these sorts exist in the ATR laboratories, and one of the satisfactions of this job will probably be seeing this software "plugged into" such applications over the next year or so. Salary and conditions are all quite competitive. Serious candidates should be driven both by the technical challenge and by a sense of adventure, since life in Japan is very different from Western life in any country, and one needs a genuine curiosity plus a sense of humor to enjoy it. Also, please apply only if you're a very hard worker; we have a lot to get done in the next year and a half or so. Here is the ad we've been circulating for this position. Please contact black@itl.atr.co.jp if this job interests you. Thanks very much.

Ezra Black
Manager, Statistical Parsing Group
Natural Language Processing Department
Interpreting Telecommunications Laboratories
Advanced Telecommunications Research Laboratories (ATR), Kyoto, Japan

Mathematician (Probability) / Programmer Job Opportunity
Statistical Natural Language Processing
ATR Interpreting Telecommunications Research Laboratories, Kyoto, Japan

ATR in Kyoto, Japan is looking for a talented and experienced mathematician (good probability background) / computer scientist who is interested in coming to Japan and joining an exciting research project in statistical natural language processing. The overall goal of the project is speech-to-speech Japanese to English and English to Japanese translation. The research group which the candidate would become part of has developed and is refining a natural language parsing system based on statistical machine learning algorithms.

A qualified candidate must have the Ph.D. degree, preferably in math, physics, or computer science, with extensive programming experience and expertise in C++. The ability to innovate approaches to applied mathematical problems in the area of probability is required. Experience with statistical modelling within the general pattern recognition field is also important. Familiarity with the natural language field is a plus but not absolutely necessary. The candidate should know UNIX well and in general be a highly expert programmer. This is not a job for those who are theoreticians only, but rather is for someone who is equally at home with issues of statistical modelling theory, experimental method, and detailed coding matters. Motif experience would be nice, though again not required. The position involves supervision of several researchers and programmers as well. Salary and conditions highly competitive.

Please contact: black@itl.atr.co.jp. Thank you.

------------------------------

From: Takao Terano
Date: Wed, 05 Feb 1997 19:33:21 +0900
Subject: CFP: IJCAI-97 Workshop on VV & Refinement of AI Systems & Subsystems

CALL FOR PARTICIPATION
IJCAI-97 Workshop on Validation, Verification & Refinement of AI Systems & Subsystems
Nagoya, Japan, one day during August 23-25, 1997

***************
WORKSHOP GOALS:
***************
Since the IJCAI/AAAI in Detroit in 1989, both IJCAI and AAAI have regularly hosted workshops focused on validation and verification (V&V) of knowledge based systems. Following that tradition, this workshop focuses on both theoretical and empirical problems related to V&V. In this workshop, the concepts of knowledge based systems are extended to other AI-related systems, which include neural net, fuzzy, evolutionary computation, and/or multiagent systems and subsystems. The workshop will mainly focus on, but not be limited to, the following topics:
V&V methods for AI related systems with KBSs, ANNs, Fuzzy logic, GAs, and/or Multi Agent technologies. 2. V&V methods for AI related subsystems that are components of larger systems with/without conventional technologies. 3. Novel V&V methods and theories for AI systems/subsystems. 4. Development of methodologies and case studies of V&V of AI systems/ subsystems. Papers on these and also on more traditional V&V subjects such as testing, tool development, correctness verification, knowledge modeling, formal methods, etc., as well as reports on existing systems, are encouraged. **************** IMPORTANT DATES: **************** Deadline for paper indication: March 28, 1997 Deadline for paper or abstract submission: March 31, 1997 Author Notification: April 10, 1997 Camera-ready version of full papers: April 25, 1997 Deadline for statements of interest: April 25, 1997 Workshop: one day during August 23-25, 1997 * Note: The deadlines are very *strict* ones. **********************- SUBMISSION INFORMATION: **********************- Those interested in presenting should submit a short paper (up to 10 pages) or abstract (up to 2 pages) describing novel research and/or results. Those interested in participation only should describe their interest in the area and/or industrial experience (up to 2 pages). Email submissions are encouraged: please use plain text, LaTeX or PostScript. Fax submissions are acceptable. For hard copy submissions, four (4) copies should be provided. Surface mail and email addresses, and phone and fax numbers should be included for all contributing authors. Submit your paper to the workshop chair: Prof. Takao Terano (Chair) email: terano@gssm.otsuka.tsukuba.ac.jp Graduate School of Systems Management, The University of Tsukuba, Tokyo 3-29-1 Otsuka, Bunkyo-ku, Tokyo 112, Japan tel.: +81-3-3942-6855 fax.: +81-3-3942-6829 Send title of your paper via e-mail to terano@gssm.otsuka.tsukuba.ac.jp by March 28, 1997. It will be used to speed-up the review process. If you have a system that you would like to demonstrate, also, please contact the workshop chair. ****************- WORKSHOP FORMAT: ****************- - one day, closed workshop, limited up to 50 participants. - paper presentation (30 minutes presentation with discussions about 10 papers will be presented.) - panel discussions (Two panels will be scheduled, One will focus on empirical issues and the other will concern theoretical ones related to recent topics presented the other VV&T workshops (EUROVAV, AAAI, etc.)) - software demonstrations (If possible, we will prepare additional computers) - discussion on future research themes ******************** FURTHER INFORMATION: ******************** All workshop participants must register for the IJCAI conference and a fee of US$ 50.00 to each participant will be charged, in addition to the normal IJCAI-97 registration fee. The workshop participation is limited based on the selection of the workshop organizer. Visit http://www.brl.ntt.jp/jsai/IJCAI-97 for more information on IJCAI-97 in general or http://www.gssm.musashi.ac.jp/persons/terano/ijcai-vvt-wrkshp.html on this workshop. Send a mail to terano@gssm.otsuka.tsukuba.ac.jp for further questions. 
********************
WORKSHOP ORGANIZERS:
********************
Takao Terano (Chair)   email: terano@gssm.otsuka.tsukuba.ac.jp
  Graduate School of Systems Management, The University of Tsukuba, Tokyo
  3-29-1 Otsuka, Bunkyo-ku, Tokyo 112, Japan
  tel.: +81-3-3942-6855   fax.: +81-3-3942-6829
Dan O'Leary   email: oleary@rcf.usc.edu
  University of Southern California, 3660 Trousdale Parkway, Los Angeles, CA 90089-1421, USA
Grigoris Antoniou   email: ga@cit.gu.edu.au
  Griffith University, Queensland 4111, Australia
Anca I. Vermesan   email: anve@vr.dnv.no
  Det Norske Veritas Research, Veritasveien 1, 1322 Hovik, Norway
Jim Schmolze   email: schmolze@cs.tufts.edu
  Tufts University
Rose Gamble   email: gamble@tara.mcs.utulsa.edu
  University of Tulsa
Kaoru Kobayashi   email: kaoru@ai.rcast.u-tokyo.ac.jp
  University of Tokyo, Japan

------------------------------

From: Asim Roy
Date: Wed, 19 Feb 1997 12:26:53 -0500 (EST)
Subject: Does plasticity imply local learning? And other questions

A predominant belief in neuroscience is that synaptic plasticity and LTP/LTD imply local learning. It is a possibility, but it is not the only possibility. Here are some thoughts on some of the other possibilities (e.g. global learning mechanisms or a combination of global/local mechanisms) and some discussion of the problems associated with "pure" local learning. The local learning idea is a core idea that drives research in a number of different fields. I welcome comments on the questions and issues raised here. This note is being sent to many listserves. I will collect all of the responses from different sources and redistribute them to all of the participating listserves. The last such discussion was very productive. It led to the realization by some key researchers in the connectionist area that "memoryless" learning perhaps is not a very "valid" idea. That recognition by itself will lead to more robust and reliable learning algorithms in the future. Perhaps a more active debate on the local learning issue will help us resolve this issue too.

A) Does plasticity imply local learning?

The physical changes that are observed in synapses/cells in experimental neuroscience when some kind of external stimulus is applied to the cells may not result at all from any specific "learning" at the cells. The cells might simply be responding to a "signal to change" - that is, to change by a specific amount in a specific direction. In animal brains, it is possible that the "actual" learning occurs in some other part(s) of the brain, perhaps by a global learning mechanism. This global mechanism can then send "change signals" to the various cells it is using to learn a specific task. So it is possible that in these neuroscience experiments, the external stimuli generate signals for change similar to those of a global learning agent in the brain, and that the changes are not due to "learning" at the cells themselves. Please note that scientific facts and phenomena like LTP/LTD or synaptic plasticity can probably be explained equally well by many theories of learning (e.g. local learning vs. global learning, etc.). However, the correctness of an explanation would have to be judged from its consistency with other behavioral and biological facts, not just one single biological phenomenon or fact.

B) "Pure" local learning does not explain a number of other "activities" that are part of the process of learning!
When learning is to take place by means of "local learning" in a network of cells, the network has to be designed prior to its training. Setting up the net before "local" learning can proceed implies that an external mechanism is involved in this part of the learning process. This "design" part of learning precedes actual training or learning by a collection of "local learners" whose only knowledge about anything is limited to the local learning law to use! In addition, these "local learners" may have to be told what type of local learning law to use, given that a variety of different types can be used under different circumstances. Imagine who is to "instruct and set up" such local learners with the type of learning law to use. In addition, the "passing" of appropriate information to the appropriate set of cells also has to be "coordinated" by some external or global learning mechanism. This coordination cannot just happen by itself, like magic. It has to be directed from some place by some agent or mechanism.

In order to learn properly and quickly, humans generally collect and store relevant information in their brains and then "think" about it (e.g. what problem features are relevant, the complexity of the problem, etc.). So prior to any "local learning," there must be processes in the brain that "examine" this "body of information/facts" about a problem in order to design the appropriate network that would fit the problem complexity, select the problem features that are meaningful, and so on. It would be very difficult to answer the questions "What size net?" and "What features to use?" without looking at the problem (the body of information) in great detail. A bunch of "pure" local learners, armed with their local learning laws, would have no clue about these issues of net design, generalization and feature selection.

So, on the whole, there are a number of activities that need to be performed before any kind of "local learning" can take place. These learning activities cannot be performed by a collection of "local learning" cells! There is more to the process of learning than simple local learning by individual cells. Many learning "decisions/tasks" must precede actual training by "local learners." A group of independent "local learners" simply cannot start learning and be able to reproduce the learning characteristics and processes of an "autonomous system" like the brain.

Local learning or local computation, however, is still a feasible idea, but only within a general global learning context. A global learning mechanism would be the one that "guides" and "exploits" these local learners or computational elements. However, it is also possible that the global mechanism actually does all of the computations (learning) and simply sends signals to the network of cells for appropriate synaptic adjustment. Both of these possibilities seem logical: (a) a "pure" global mechanism that learns by itself and then sends signals to the cells to adjust, or (b) a global/local combination where the global mechanism performs certain tasks and then uses the local mechanism for training/learning. Thus note that the global learning mechanism may actually be implemented with a collection of local learners or computational elements! However, certain "learning decisions" are made in the global sense and not by "pure" local learners.
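The local/global distinction being argued here can be made concrete with a minimal numerical sketch (my own illustration, not something from this posting): a purely local rule updates a weight using only the activity available at that synapse, whereas a global mechanism computes a task-level error elsewhere and then dictates each weight's change. The toy linear unit and the two update functions below are assumptions introduced purely for illustration.

    # Illustrative sketch only: local (Hebbian-style) vs. globally directed updates.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 3))            # pre-synaptic activity (inputs)
    w = rng.normal(size=3)                   # weights of a single linear unit
    target = x @ np.array([1.0, -2.0, 0.5])  # task known only to the "global" mechanism

    def local_hebbian_step(w, x, lr=0.01):
        # Local rule: each weight sees only its own input and the unit's own output.
        y = x @ w
        return w + lr * (x * y[:, None]).mean(axis=0)

    def global_error_step(w, x, target, lr=0.1):
        # "Global" rule: an external mechanism measures the task error and sends
        # each weight a change signal of a specific size and direction.
        y = x @ w
        error = target - y                   # computed outside the synapse
        return w + lr * (x * error[:, None]).mean(axis=0)

    for _ in range(200):
        w = global_error_step(w, x, target)
    print(np.round(w, 2))  # approaches [1, -2, 0.5]; a purely Hebbian unit has no notion of this task

Nothing in this sketch settles the biological question, of course; it only shows that the same synaptic change can be driven either by locally available quantities or by an externally computed signal, which is the distinction the note is drawing.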
The basic argument being made here is that there are many tasks in a "learning process" and that a set of "local learners" armed with their local learning laws is incapable of performing all of those tasks. So local learning can only exist in the context of global learning and thus is only "a part" of the total learning process. It will be much easier to develop a consistent learning theory using the global/local idea. The global/local idea will perhaps also give us a better handle on the processes that we call "developmental" and "evolutionary," and it will perhaps allow us to better explain many of the puzzles and inconsistencies in our current body of discoveries about the brain. And, not least, it will help us construct far better algorithms by removing the "unwarranted restrictions" imposed on us by the current ideas.

Any comments on these ideas and possibilities are welcome.

Asim Roy
Arizona State University

------------------------------

From: Wray Buntine
Date: Tue, 4 Feb 97 22:11:18 PST
Subject: Research Scientist for autonomous data analysis

NASA's Center of Excellence in Information Technology at Ames Research Center invites candidates to apply for a position as Research Scientist in Information Technology.

Position description:
* We seek applicants to join a small team of space scientists and computer scientists in developing NASA's next generation smart spacecraft with on-board, autonomous data analysis systems. The group includes leading space scientists (Ted Roush, Virginia Gulick) and leading data analysts (Wray Buntine, Peter Cheeseman), and their counterparts at JPL.
* The team is doing the research and development required for the task, and has a multi-year program with deliverables planned. This is not a pure research position, and requires dedication in seeing completion of the R&D milestones.
* The applicant will be responsible for the information technology side of R&D, with guidance from senior space scientists on the project.
* The research has strong links with on-going work at the Center of Excellence and is an integral part of NASA's long term goals.

Candidate requirements:
* Strong interest in demonstrating autonomous analysis systems to enhance science understanding in operational tests, with the ultimate goal of putting such systems in space.
* Ph.D. degree in Computer Science, Electrical Engineering, or a related field, and applied experience, possibly within the PhD. In exceptional cases, an M.S. degree with relevant work experience will suffice.
* Knowledge of neural or probabilistic networks, machine learning, statistical pattern recognition, image processing, science data processing, probabilistic algorithms, or related topics is essential.
* Strong communication and organizational skills with the ability to lead a small team and interact with scientists.
* Strong C programming and Unix skills (experimental, not necessarily production), with experience in programming mathematical algorithms: C++, Java, MatLab, IDL.

Application deadline:
* March 15th, 1997 (hardcopy required -- see below).

Please send any questions by e-mail to the addresses below, and type "PI for Autonomous data analysis" as your header line.
Dr. Ted Roush: roush@cloud9.arc.nasa.gov
Dr. Wray Buntine: buntine@cloud9.arc.nasa.gov

Full applications (which must include a resume and the names and addresses of at least two people familiar with your work) should be sent by surface mail (no e-mail, ftp or html applications will be accepted) to:
Dr. Steve Lesh
Attn: PI for Autonomous data analysis
Mail Stop 269-1
NASA Ames Research Center
Moffett Field, CA, 94025-1000

------------------------------

From: Blaz Zupan
Date: Wed, 5 Feb 1997 10:34:47 +0100 (MET)
Subject: IDAMAP-97: Reminder and brief Second CFP

Reminder and brief Second Call for Papers for IDAMAP-97
INTELLIGENT DATA ANALYSIS IN MEDICINE AND PHARMACOLOGY
Saturday, August 23, 1997
Workshop W15 at IJCAI-97, August 23-29, 1997, Nagoya, Japan

Paper submission deadline is March 3, 1997. Submit 8-12 page papers by e-mail (postscript) and 3 hard copies by surface mail to:
Nada Lavrac, Blaz Zupan
J. Stefan Institute, Jamova 39, SI-1000 Ljubljana, Slovenia
email: idamap97@ijs.si
For up-to-date workshop information please check: http://www-ai.ijs.si/ailab/activities/idamap97.html

------------------------------

From: foster@nynexst.com
Date: Wed, 12 Feb 1997 10:05:48 +0500
Subject: 3 positions at NYNEX S&T

[Those interested should contact Yuling Wu, as indicated below. -F]

Job Openings at NYNEX Science and Technology

The Integrated Network Services Testing & Analysis (INSTA) group at NYNEX Science & Technology has three openings for knowledge-based (KB) diagnostic system developers. The group is involved in building monitoring, testing and diagnostic systems using state-of-the-art AI technologies for advanced telecom networks and circuits. The group has been building systems that support complete testing and diagnosis of circuits, both from the central office and in the field. Systems already built and deployed test and diagnose residential telephone lines and some business services. In addition to these, the group is currently looking at ISDN and broadband services.

The selected candidate would work on one or more of the following projects:
- Building a KB system for assisting field technicians in testing and troubleshooting faults in telecom circuits. The candidate will also explore complementing this KB with the KB performing centralized testing from the Central Office.
- Building a KB system for automated centralized testing and diagnosis of Special (Business) service circuits.
- Building monitoring, testing, and diagnostic systems for broadband circuits.
- Building an intelligent interactive assistant to aid testers in testing and diagnosing circuits.

Suitable candidates must have the following:
===========================================
- Background in Computer Science, Computer Engineering or Electrical Engineering.
- Experience in all aspects of building knowledge-based systems, including knowledge acquisition, knowledge engineering, domain and task modeling, testing, validation, and evaluation of knowledge-based systems.
- Good understanding of various AI techniques such as model-based reasoning, case-based reasoning, neural nets, and machine learning.
- Good analytical skills.
- Quick learner - able to quickly acquire relevant domain knowledge.
- Good system building experience.

Experience with the following would be a plus:
===============================================
- Knowledge of data-analysis tools (e.g. statistical tools)
- Unix, C, C++, LISP, ARTIM, CLIPS...
- Distributed client-server architectures
- Databases, data warehousing
- Telecom experience: Operation Support Systems, residential lines, special services, broadband services, telecom network and circuit testing, alarm monitoring, etc.

If interested, please mail a hard copy of your resume to:
Yuling Wu
NYNEX Science & Technology
400 Westchester Av.
White Plains, NY 10604
or email the postscript version to: yuling@nynexst.com
------------------------------
From: Luc De Raedt
Date: Mon, 17 Feb 1997 15:05:21 +0100 (MET)
Subject: IJCAI Workshop: FRONTIERS OF INDUCTIVE LOGIC PROGRAMMING

==========================================================================
CALL FOR PARTICIPATION and PAPERS
IJCAI-97 Workshop on FRONTIERS OF INDUCTIVE LOGIC PROGRAMMING
Monday 25 August 1997
==========================================================================

GENERAL INFORMATION
The IJCAI-97 one-day workshop on "Frontiers of ILP" in Nagoya, Japan, will take place on August 25, immediately prior to the start of the main IJCAI conference.

TECHNICAL DESCRIPTION
Inductive logic programming (ILP) is a recent subfield of artificial intelligence that studies the induction of first-order formulae from examples. The purpose of this workshop is twofold: on the one hand, we wish to widen the scope of ILP by investigating its relations to neighboring fields, and on the other hand, we wish to make ILP more accessible to researchers from neighboring fields. The workshop therefore solicits papers that lie at the frontiers of ILP with neighboring fields. A non-exclusive list of interesting topics for the workshop includes:
* ILP and Software Engineering: what has ILP to offer Software Engineering, and in what way can Software Engineering help to design ILP systems and applications?
* ILP for Knowledge Discovery in Databases: ILP aims at learning complex rules involving multiple relations from small databases, whereas KDD typically induces simple rules about a single relation from a large database. Furthermore, ILP makes it possible to exploit background knowledge in a variety of ways. Can KDD and ILP be successfully combined?
* ILP and Computational or Algorithmic Learning Theory: though many results have been obtained concerning the learnability of inductive logic programming, most of the results are negative and most of the positive results are reducible to propositional learning methods. Is there a mismatch of COLT with ILP? And if so, what can be done about it?
* ILP versus propositional learning methods: Since the very start of ILP, researchers and practitioners of machine learning have wondered about the relation between ILP and propositional learning methods. Theoretical and experimental questions that arise include: when to use ILP and when to use propositional learning methods? Under what circumstances can ILP be reduced to propositional learning? What is the price to pay for using first-order logic in terms of efficiency?
* ILP and Knowledge Representation: ILP has traditionally employed computational logic to represent hypotheses and observations. Alternative well-founded knowledge representation formalisms have received little attention (with the exception of CLASSIC). What can ILP learn from Knowledge Representation, and in what well-founded Knowledge Representation formalisms is induction feasible?
* ILP in multistrategy learning: Multistrategy learning combines multiple learning strategies. What role can ILP play in multistrategy learning?
* ILP and Probabilistic reasoning: in contrast to propositional learning methods, ILP has not used probabilistic representations. How can ILP incorporate such representations, and how can it interact with methods such as Bayes nets or Hidden Markov Models?
* ILP for Intelligent Information Retrieval: The rapid development of the World Wide Web has spawned significant interest in intelligent information retrieval. In particular, algorithms for reliably classifying textual documents into given categories (like interesting/uninteresting) would be useful for a wide variety of tasks. Currently, most learning algorithms are not able to make use of structural information like word order, successive words, structure of the text, etc. Can ILP algorithms offer advantages over conventional information retrieval or machine learning algorithms for this sort of task?
* Applications of ILP in subfields of AI: ILP has been applied to other subfields of AI, including natural language processing, intelligent agents and planning. Further applications of ILP within AI are solicited.

Both position papers about the relation of ILP to other fields and research papers that make specific technical contributions are solicited. However, to stimulate discussion, it is expected that each technical paper also clarifies the position of ILP with regard to the neighboring field(s) it addresses. In addition to the presentation of position and technical papers, the workshop will also feature a panel discussion on the frontiers of ILP and possibly an invited talk.

ORGANISERS
Luc De Raedt (chair and primary contact)
Saso Dzeroski
Koichi Furukawa
Fumio Mizoguchi
Stephen Muggleton

PROGRAMME COMMITTEE
Francesco Bergadano (Italy)
Luc De Raedt (co-chair, Belgium)
Saso Dzeroski (Slovenia)
Johannes Furnkranz (Austria)
Koichi Furukawa (Japan)
David Page (U.K.)
Fumio Mizoguchi (Japan)
Ray Mooney (U.S.A.)
Stephen Muggleton (co-chair, U.K.)

CALL FOR PARTICIPATION
Participation is open to all members of the AI Community. However, to encourage interaction and a broad exchange of ideas, the number of participants will be strictly limited (preferably under 30 and certainly under 40). Participants will be selected on the basis of submissions. Three types of submissions will be considered:
1) technical contributions (ideally, a 3 to 5 page extended abstract in the IJCAI Proceedings Format, 3000-4000 words),
2) position papers (ideally, a 1 to 3 page abstract in the IJCAI Proceedings Format, 1000-3000 words),
3) a statement of interest (ideally, a one-page motivation of why you would like to participate, 300-500 words).
Only submissions of type 1) and 2) will be considered for presentation at the workshop and inclusion in the workshop notes. Submissions should be received no later than April 1, 1997, and must include the first author's complete contact information, including address, email, phone, and fax number. Though 1 April is the hard deadline, the authors are encouraged to submit their material by 24 March, in order to facilitate the reviewing process. Double submissions with the ILP-97 Workshop (which is to take place in Prague, September 1997) are allowed.

SUBMISSIONS
Submit papers by email (postscript) and surface mail (2 copies) to
Luc De Raedt
Dept. of Computer Science
Katholieke Universiteit Leuven
Celestijnenlaan 200A
B-3001 Heverlee
Belgium
Email: Luc.DeRaedt@cs.kuleuven.ac.be

IMPORTANT DATES
- Paper submission: 1 April
- Notification to Authors: 21 April
- Camera ready copy: the submissions themselves will serve as camera ready copy (submissions in the IJCAI Proceedings Style are strongly preferred, see http://www.ijcai.org/ijcai-97/ for details)

PUBLICATION
The accepted submissions will be included in the workshop notes to be distributed at the workshop.
Post-conference publication of a selection of the workshop papers will be considered and discussed at the workshop.

COSTS
To cover costs, a fee of $US 50 will be charged, in addition to the normal IJCAI-97 conference registration fee. Attendees of IJCAI workshops will be required to register for the main IJCAI conference.
------------------------------
From: Wei Zhang
Date: Mon, 17 Feb 1997 16:50:20 -0800
Subject: Job: machine learning at Boeing

**Outstanding Machine Learning Researcher needed**

The Boeing Company, the world's largest aerospace company, is actively working on research projects in advanced computing technologies, including projects involving NASA, FAA, Air Traffic Control, and Global Positioning, as well as airplane and manufacturing research. The Research and Technology organization, located in Bellevue, Washington, near Seattle, has an open position for a machine learning researcher. We are the primary computing research organization for Boeing and have contributed heavily both to short-term technology advances and to long-range planning and development.

BACKGROUND REQUIRED: Machine Learning, Knowledge Discovery, Data Mining, Statistics, Artificial Intelligence or related field.

RESEARCH AREAS: We are developing and applying techniques for data mining and statistical analyses of diverse types of data, including: safety incidents, flight data recorders, reliability, maintenance, manufacturing, and quality assurance data. These are not areas where most large R&D data mining efforts are currently focused. Research areas include data models, data mining algorithms, statistics, and visualization. Issues related to our projects also include pattern recognition, multidimensional time series, and temporal databases. We can achieve major practical impacts in the short term, both at Boeing and in the airline industry, which may result in safer and more cost-effective air travel.

A Ph.D. in Computer Science or equivalent experience is highly desirable for the position. We strongly encourage diversity in backgrounds, including both academic and industrial experience. Knowledge of machine learning, statistics, and data mining is an important factor. Experience with databases and programming (C/C++, JAVA, and Splus) is desirable.

APPLICATION: If you meet the requirements and are interested, please send your resume by e-mail in plain ASCII format to zhangw@redwood.rt.cs.boeing.com (Wei Zhang). You can also send it via US mail to
Wei Zhang
The Boeing Company
PO Box 3707, MS 7L-66
Seattle, WA 98124-2207
Application deadline is April 30, 1997. The Boeing Company is an equal opportunity employer.
------------------------------
From: Catherine.Garbay@imag.fr
Date: Thu, 13 Feb 1997 13:30:25 +0000
Subject: AIME'97 Programme & Registration

__________________________________
6th Conference on ARTIFICIAL INTELLIGENCE in MEDICINE EUROPE
23rd - 26th March 1997
World Trade Center - Grenoble, France
__________________________________

WELCOME TO AIME'97
The AIME'97 Conference is organized by the European Society for Artificial Intelligence in Medicine, and is the sixth in the Society's series of international biennial conferences. The AIME'97 team, consisting of the Organizing Committee and the Programme Committee, has endeavoured to provide unique educational and sharing opportunities with respect to the international state of the art of Artificial Intelligence in Medicine (AIM) from a research and application perspective.
Each day of the conference will include an invited lecture in an area of interest for participants. At the end of the Conference, the programme will offer a round-table discussion focusing on current strategic issues for the AIM Community. The Scientific Programme includes 56 presentations (33 papers and 23 posters) covering the following broad subject categories:
* Protocols & Guidelines
* Knowledge Acquisition & Learning
* Decision-Support Theories
* Diagnostic Problem Solving
* Probabilistic Models & Fuzzy Logic
* Temporal Reasoning & Planning
* Natural Language & Terminology
* Image & Signal Processing
* Hybrid & Cooperative Systems
and other areas relevant to AIM. As already announced, the authors of the best papers will be invited to submit revised and extended versions of their papers to be considered for publication in a special issue of the AIM journal. These invitations will be sent after the Conference. Special attention has been given to the presentation of posters, which have been allocated two plenary sessions, with time for oral presentation. Live demonstrations will also be possible. Finally, a set of tutorials is organized as the first activity at the Conference on March 23rd. The Programme Committee thanks all the contributors to the Conference for making it possible to present such a scientifically outstanding programme.
_______________________________
PROGRAMME COMMITTEE CHAIR
Elpida Keravnou (Nicosia, Cyprus)
ORGANIZING COMMITTEE CHAIR
Catherine Garbay (Grenoble, France)
Robert Baud (Geneva, Switzerland, co-Chair)
TUTORIALS Chair
Jeremy Wyatt (London, United Kingdom)
PROGRAMME COMMITTEE
Steen Andreassen (Aalborg, Denmark)
Pedro Barahona (Lisboa, Portugal)
Robert Baud (Geneva, Switzerland)
Jan van Bemmel (Rotterdam, The Netherlands)
Enrico Coiera (Bristol, United Kingdom)
Carlo Combi (Milano, Italy)
Luca Console (Torino, Italy)
Michel Dojat (Créteil, France)
Rolf Engelbrecht (Munich, Germany)
John Fox (London, United Kingdom)
Catherine Garbay (Grenoble, France)
Werner Horn (Vienna, Austria)
Jim Hunter (Aberdeen, United Kingdom)
Nada Lavrac (Ljubljana, Slovenia)
Stelios Orphanoudakis (Heraklion, Greece)
Alan Rector (Manchester, United Kingdom)
Costas Spyropoulos (Athens, Greece)
Mario Stefanelli (Pavia, Italy)
Mario Veloso (Lisboa, Portugal)
John Washbrook (London, United Kingdom)
Jeremy Wyatt (London, United Kingdom)
LOCAL ORGANIZING COMMITTEE
Jean-Dominique Monet
Georges Weil
Nicole Brochier
Paulette Souillard
Catherine Plottier
Jacques Chevallier
Pierre Kermen
SECRETARIAT
SELECTOUR-CONGRES
BP 53, F-38041 GRENOBLE CEDEX 09, FRANCE
Tel: + 33 476 44 86 26 - Fax: + 33 476 51 45 46
Email: dcongres@imaginet.fr
WWW: http://www-timc.imag.fr/aime97
__________________________________
TUTORIALS - Sunday, March 23rd
1. Natural Language Processing
Instructors: Anne-Marie Rassinoux (PhD, Vanderbilt University, Nashville, U.S.A.) and Robert Baud (PhD, University Hospital of Geneva, Geneva, Switzerland)
2. Protocols, guidelines and clinical decision support systems: a practical introduction based on the PROforma methodology
Instructors: John Fox and Nicky Johns (Imperial Cancer Research Fund, London, U.K.)
3. Evaluation of Artificial Intelligence Systems
Instructors: Charles P. Friedman (Director of the Center for Biomedical Informatics at the University of Pittsburgh, U.S.A.)
and Jeremy Wyatt (Senior research fellow at the Centre for Statistics in Medicine, Institute for Health Sciences, Oxford University and consultant in medical informatics, Imperial Cancer Research Fund, London, U.K.). 4. Development, Evaluation, and Dissemination of Diagnostic Decision Support Systems (with QMR as a primary example) Instructors : Randolph A. Miller (Vanderbilt University, U.S.A. ; Randolph A. Miller is the author of QMR) __________________________________ PROGRAMME MONDAY, March 24th 09. 00 OPENING CEREMONY PROTOCOLS & GUIDELINES Chair: Mario Veloso (Portugal) 09. 25 Protocols for medical procedures and therapies: a provisional description of the PROforma language and tools J. Fox, N. Johns and A. Rahmanzadeh (United Kingdom) 09. 50 Supporting tools for guideline development and dissemination S. Quaglini, R. Saracco, M. Stefanelli, and C. Fassino (Italy) 10. 15 A task-specific ontology for the application and critiquing of time-oriented clinical guidelines Y. Shahar, S. Miksch and P. Johnson (U.S.A.) 10. 40 COFFEE BREAK KNOWLEDGE ACQUISITION & LEARNING Chair: Nada Lavrac (Slovenia) 11. 10 Detecting very early stages of dementia from normal aging with machine learning methods W.R. Shankle, S. Mani, M. Pazzani, P. Smyth (U.S.A.) 11. 35 Acquiring and validating background knowledge for machine learning using function decomposition B. Zupan and S. Dzeroski (Slovenia) 12. 00 Automated revision of expert rules for treating acute abdominal pain in children S. Dzeroski, G. Potamias, V. Moustakis and G. Charissis (Slovenia-Greece) 12. 25 Evaluation of automatic and manual knowledge acquisition for cerebrospinal fluid (CSF) diagnosis A. Ultsch, T.O. Kleine, D. Korus, S. Farsch, G. Guimaraes, W. Pietzuch and J. Simon (Germany) 13. 00 LUNCH 14. 30 INVITED TALK Chair: Steen Andreassen (Denmark) "Robots as Surgical Assistants: Where we are, where we are tending, and how to get there." Professor Russel H. Taylor (U.S.A.) 15. 30 POSTER SESSION I Chair: Steen Andreassen (Denmark) Protocols & Guidelines * User-adapted multimedia explanations in clinical guidelines consultation system B. De Carolis, G. Rumolo and V. Cavallo (Italy) * Algorithm and care pathway: clinical guidelines and healthcare processes C. Gordon, P. Johnson, C. Waite and M. Veloso (United Kingdom-Portugal) Knowledge Acquisition & Learning * Knowledge acquisition by the domain expert using the tool HEMATOOL J.P. Du Plessis and C.J. Tolmie (South Africa) * Application of inductive logic programming for learning ECG waveforms G. Kokai, Z. Alexin and T. Gyimothy (Hungary) * Knowledge discovery from a breast cancer database S. Mani, M. Pazzani and J. West (U.S.A.) * An adaptive two-tier menu approach to support on-line entry of diagnoses S.K. Mudali, J.R. Warren and S.E. Spenceley (Australia) * Machine learning applied to diagnosis of sport injuries I. Zelic, I. Kononenko, N. Lavrac and V. Vuga (Slovenia) Decision-Support Theories * A new approach to feature extraction M. Scherf (Germany) Diagnostic Problem Solving * The clinical spectrum of decision-support in oncology with a case report of a real world system A. Geissbuhler, R.A. Miller and W.W. Stead (U.S.A.) * A case-based reasoning method for computer-assisted diagnosis in histopathology M-C. Jaulent, C. Le Bozec, E. Zapletal and P. Degoulet (France) * Advances in heuristic shell design: combining rule syntax power with graphical knowledge acquisition B. Puppe, F. Puppe and S. 
Bamberger (Germany) * The validation of an expert system for diagnosis of acute myocardial infraction A. Rabelo Jr, A.R. Rocha, A. Souza, A. Araujo, A. Ximenes, C. Andrade, D. Onnis, I. Olivaes, K. Oliveira, N. Lobo, N. Ferreira, S. Lopes and V. Werneck (Brasil) 16. 15 COFFEE BREAK DECISION-SUPPORT THEORIES Chair: Werner Horn (Austria) 16. 45 A theoretical framework for decision trees in uncertain domains: application to medical data sets B. Cremilleux and C. Robert (France) 17. 10 Developing a decision-theoretic network for a congenital heart disease N. Peek and J. Ottenkamp (The Netherlands) 17. 35 A theory of medical diagnosis as hypothesis refinement P. Lucas (The Netherlands) 19. 00 WELCOME COCKTAIL __________________________________ TUESDAY, March 25th DIAGNOSTIC PROBLEM SOLVING Chair : Mario Stefanelli (Italy) 09. 00 A heuristic approach to the multiple diagnoses problem R.A. Miller (U.S.A.) 09. 25 Intelligent assistance for coronary heart disease diagnosis: a comparison study G. Bologna, A. Rida and C. Pellegrini (Switzerland) 09. 50 Diagnosis and monitoring of ulnar nerve lesions J. Rahmel, C. Blum, P. Hahn and B. Krapohl (Germany) 10. 15 Hypothesist: a development environment for intelligent diagnostic systems D. McSherry (United Kingdom) 10. 40 COFFEE BREAK PROBABILISTIC MODELS & FUZZY LOGIC Chair:John Fox (United Kingdom) 11. 10 A causal-functional model applied to EMG diagnosis J. Cruz and P. Barahona (Portugal) 11. 35 Learning Bayesian networks by genetic algorithms. A case study in the prediction of survival in malignant skin melanoma P. Larranaga, B. Sierra, M.J. Gallego, M.J. Michelena and J.M. Picaza (Spain) 12. 00 A neuro-fuzzy-classifier for a knowledge-based glaucoma monitor G. Zahlmann, M. Scherf and A. Wegner (Germany) 12. 25 A method for diagnosing in large medical expert systems based on causal probabilistic networks M. Suojanen, K.G. Olesen, and S. Andreassen (Denmark) 13. 00 LUNCH 14. 30 INVITED TALK Chair: Costas Spyropoulos (Greece) "Intelligent Image Management in an Integrated Telemedicine Services Network." Professor Stelios Orphanoudakis (Crete) 15. 30 POSTER SESSION II Chair: Costas Spyropoulos (Greece) Probabilistic Models & Fuzzy Logic * Dynamic decision making in stochastic partially observable medical domains: ischemic heart disease example M. Hauskrecht (U.S.A) * Self-learning fuzzy logic control in medicine D.G. Mason, D.A. Linkens and N.D. Edwards (United Kingdom) Temporal Reasoning & Planning * Medical planning environment D. Ziebelin and A. Vila (France) Natural Language & Terminology * Multilingual decision-support for the diagnosis of acute abdominal pain: A European Concerted Action (COPERNICUS 555) Ohmann, H.P. Eich, E. Kein and the COPERNICUS Abdominal Pain Study Group (Germany) * Medical concept systems, lexicons and natural language generation J.C. Wagner, C. Lovis, R.H. Baud and J-R. Scherrer (Switzerland) Image & Signal Processing * Rule-based labeling of CT head image D. Cosic and S. Loncaric (Croatia) * Characterisation of tumorous tissue in rat brain by in vitro magnetic resonance spectroscopy and artificial neural networks T. Derr, T. Els, M.L. Gyngell and D. Leibfritz (Germany) * An application of machine learning in the diagnosis of ischaemic heart disease M. Kukar, C. Groselj, I. Koronenko, and J.J. Fettich (Slovenia) Hybrid & Cooperative Systems * Case-based reasoning and statistics for discovering and forecasting of epidemics M. Bull, G. Kundt and L. 
Gierl (Germany)
* Knowledge refinement of an expert system using a symbolic-connectionist approach D. Lorenzo, J. Santos, S. Gomez, J. Heras and R.P. Otero (Spain)
* Integrated decision support: the DIADOQ computer-based patient record W. Moser and T. Diedrich (Germany)
* Extending a symbolic problem solver to handle incomplete and uncertain information W.M. Post (The Netherlands)
16. 15 COFFEE BREAK
TEMPORAL REASONING & PLANNING Chair: Pedro Barahona (Portugal)
16. 45 Planning and scheduling patient tests in hospital laboratories C.D. Spyropoulos, S. Kokkotos and C. Marinagi (Greece)
17. 10 Temporal abstractions for diabetic patients management C. Larizza, R. Bellazzi and A. Riva (Italy)
17. 35 Temporal scenario recognition for intelligent patient monitoring N. Ramaux, D. Fontaine and M. Dojat (France)
19. 00 CONFERENCE DINNER
__________________________________
WEDNESDAY, March 26th
NATURAL LANGUAGE & TERMINOLOGY Chair: Rolf Engelbrecht (Germany)
09. 00 Strengthening argumentation in medical explanations by text plan revision F. de Rosis, F. Grasso and D.C. Berry (Italy)
09. 25 An ontological analysis of surgical deeds A. Rossi Mori, A. Gangemi, G. Steve, F. Consorti and E. Galeazzi (Italy)
09. 50 Building medical dictionaries for patient encoding system, a methodology C. Lovis, R. Baud, A.M. Rassinoux, P.A. Michel and J.R. Scherrer (Switzerland)
10. 15 A semantics-based communication system for dysphasic subjects P. Vaillant (France)
10. 40 COFFEE BREAK
IMAGE & SIGNAL PROCESSING Chair: Michel Dojat (France)
11. 10 Distributed plan construction and execution for medical image interpretation N. Bianchi, P. Bottoni, C. Garbay, P. Mussio and C. Spinu (Italy-France)
11. 35 Improved identification of the human shoulder kinematics with muscle biological filters J-P Draye, G. Cheron, D. Pavisic and G. Libert (Belgium)
12. 00 A society of goal-oriented agents for the analysis of living cells A. Boucher, A. Doisy, X. Ronot and C. Garbay (France)
12. 25 Methodology for the design of digital brain atlases B. Gibaud, S. Garlatti, C. Barillot and E. Faure (France)
13. 00 LUNCH
14. 00 INVITED TALK Chair: Enrico Coiera (United Kingdom) "Artificial Intelligence Technologies: Conditions for Further Impact" Professor Jean-Raoul Scherrer (Switzerland)
HYBRID & COOPERATIVE SYSTEMS Chair: Enrico Coiera (United Kingdom)
15. 00 Meta-level learning in a hybrid knowledge-based architecture E. Christodoulou and E.T. Keravnou (Cyprus)
15. 25 A framework for building cooperating agents G. Lanzola, M. Campagnoli, S. Falasconi and M. Stefanelli (Italy)
15. 50 Adding knowledge to information retrieval systems in the world wide web G. Mann, M. Schubert and V. Schaeffler (Germany)
16. 15 Learning from data through the integration of qualitative models and fuzzy systems R. Bellazzi, L. Ironi, R. Guglielmann, and M. Stefanelli (Italy)
16. 40 COFFEE BREAK
17. 00 PANEL SESSION
18. 00 CLOSING CEREMONY
__________________________________
CONFERENCE VENUE
The SIXTH CONFERENCE ON ARTIFICIAL INTELLIGENCE IN MEDICINE EUROPE will be held at the World Trade Center, a building located in the center of the city of Grenoble, close to the railway station and the shuttle station from Lyon and Grenoble airports, and within walking distance of the major hotels and restaurants. Cloakroom, telephone and FAX facilities are available at the meeting site.
CONFERENCE SECRETARIAT DESK
At the Registration Desk, SELECTOUR CONGRES will handle all travel arrangements: reservation, modification, ticketing, shuttles and taxis to airports, car rental and all off-conference travel services.
Hours:
Sunday 23rd 08.00 - 13.00 14.00 - 18.00
Monday 24th 08.00 - 13.00 14.00 - 18.00
Tuesday 25th 08.00 - 13.00 14.00 - 18.00
Wednesday 26th 08.00 - 13.00 14.00 - 18.00

REGISTRATION
Please use the enclosed REGISTRATION FORM. Students should attach a letter of support from their supervisor or institution confirming student status. The registration fees include:
* 3 days' participation and the Proceedings of the Conference
* coffee breaks and lunches (Monday, Tuesday, Wednesday)
* Opening Welcome Cocktail (Monday 24th)

PRE-CONFERENCE TUTORIAL FEES
Morning: 1 - Natural Language Processing; 3 - Evaluation of Artificial Intelligence Systems
Afternoon: 2 - Protocols, guidelines and clinical decision support systems; 4 - Development, Evaluation, and Dissemination of Diagnostic Decision Support Systems
Since there are two tutorial sessions in parallel, two tutorials at most can be selected (one in the morning, and one in the afternoon).

SOCIAL EVENTS
A Welcome Cocktail for all participants will be held on Monday evening, March 24th, 19:00 hr, in the World Trade Center. The Conference Dinner will be held on Tuesday, March 25th, 20:30 hr at the "Chateau de Vizille", a castle and a museum of the French Revolution, which is located about 30 km from Grenoble. A bus will leave from the conference site at 19:00 hr to allow a free visit to the museum.

ACCOMMODATION
Most hotels are within walking distance of the World Trade Center conference building. Please use the enclosed ACCOMMODATION FORM. Average price per person, per night, including breakfast:
De Luxe Hotel 4**** FFr. 650/750
First Class 3*** FFr. 350/580
Economy 2** FFr. 250/350

PAYMENT
Payment of the registration fee or of the accommodation deposit should be made in French francs, net of all bank charges and commissions. Please indicate your name and address on any cheque or bank draft. More instructions are given below.

CANCELLATION
The Secretariat of the AIME'97 Conference must be notified in writing of any cancellation. A cancellation fee of FFr. 500 will be charged until 15 March 1997. After this date there will be no refund of registration fees. Nevertheless, any cancellation or change in participation schedule has to be notified to the Registration Desk.

BANKING
Currency exchange and banking facilities are in the immediate vicinity of the Conference site and open from 09:00 to 16:00 hr. International credit cards, traveller's cheques and Eurocheques are widely accepted and quite practical.

POSTERS
The posters will be on display for the whole duration of the Conference. A 3-minute oral presentation will be given for each poster during a plenary poster session. The poster exhibition room is open on Sunday 23 March 1997 for setting up and on Wednesday 26 March 1997 for dismantling before 18:00 hr. Posters can be visited daily from 08:30 to 18:00 hr. Poster board: 1.85 m (h) x 0.89 m (w)

SLIDES
Slides must be delivered to the SLIDE ROOM no later than the break preceding the session, or by 08:30 hr for the first morning session. The SLIDE ROOM is located behind the AUDITORIUM. Dual projection is possible in all lecture rooms.
__________________________________
REGISTRATION FORM
Kindly complete in CAPITALS and mail to SELECTOUR-CONGRES, BP 53, F-38041 GRENOBLE CEDEX 9, FRANCE
AIME'97 6th Conference on Artificial Intelligence in Medicine Europe
Family Name ......................................
First Name .......................................
Title ............................................
Organization .....................................
Address ..........................................
City .............................................
State ............................................
Country ..........................................
Zip Code .........................................
Telephone ........................................
Fax ..............................................
E-Mail ...........................................
___________________________________________________________
Conference fees:
* Before February 15th: FFr. 2 300
* After February 15th: FFr. 2 850
* Student before February 15th: FFr. 1 300
* Student after February 15th: FFr. 1 650
DEADLINE FOR REDUCED FEES EXTENDED TO THE 28TH OF FEBRUARY
Tutorial fees:
* 1 Tutorial: FFr. 500
* 2 Tutorials: FFr. 800
Morning 1 /_/ 2 /_/ Afternoon 3 /_/ 4 /_/
* Social Dinner FFr. 300
TOTAL OF REGISTRATION FEES:
______________________________________________________________
Payment:
* By bank-to-bank transfer to "CONGRES AIME 97 - Régie de Recettes", T.P. GRENOBLE - TRESOR GLE, account number 10071 38000 00002000255/73
* By sending a certified cheque in French francs together with the registration form, drawn on a French bank, to the order of "CONGRES AIME 97 - Régie de Recettes".
* By credit card: Eurocard /_/ EC Mastercard /_/ MC Visa /_/ VC
Charge to credit card n° /_/_/_/_/ /_/_/_/_/ /_/_/_/_/ /_/_/_/_/
Expiration date:
Name of credit card holder:
Card holder signature:
__________________________________________
ACCOMMODATION FORM
Kindly complete in CAPITALS and mail to SELECTOUR-CONGRES, BP 53, F-38041 GRENOBLE CEDEX 9, FRANCE
AIME'97 6th Conference on Artificial Intelligence in Medicine Europe
Family Name ......................................
First Name .......................................
Title ............................................
Organization .....................................
Address ..........................................
City .............................................
State ............................................
Country ..........................................
Zip Code .........................................
Telephone ........................................
Fax ..............................................
E-Mail ...........................................
Arrival by: Plane /_/ Train /_/ Car /_/
Date of arrival                 Date of departure                 Night
Single room /_/ Double room /_/
Deposit to pay for booking:
De Luxe Hotel 4**** FFr. 700 /_/
First Class 3*** FFr. 450 /_/
Economy 2** FFr. 300 /_/
In case the category you chose is not available, please state a second preference: higher category /_/ lower category /_/
Requests will be honored in the order received. A penalty of FFr. 100 will be charged for any modification. In case of cancellation after 15th March, the deposit will not be refunded.
Payment:
* By bank-to-bank transfer to "SELECTOUR-CONGRES", Banque NEUFLIZE PARIS St GEORGE, compte n° 30788 00169 10002700108
* By sending a certified cheque in French francs together with the registration form, drawn on a French bank, to the order of "SELECTOUR-CONGRES".
* By credit card: Eurocard /_/ EC Mastercard /_/ MC Visa /_/ VC
Charge to credit card n° /_/_/_/_/ /_/_/_/_/ /_/_/_/_/ /_/_/_/_/
Expiration date:
Name of credit card holder:
Card holder signature:
__________________________________________
Catherine GARBAY
Lab. TIMC - IMAG, Institut Bonniot
Faculté de Médecine - Domaine de la Merci
38706 La Tronche - France
Tél: 33 (0) 4 76 54 94 85 Fax: 33 (0) 4 76 54 95 49
URL: http://www-timc.imag.fr/sic
------------------------------
From: Marney Smyth
Date: Sat, 1 Feb 1997 12:20:16 -0500 (EST)
Subject: Learning Methods Tutorial -- Washington DC, May 1997

**************************************************************
***  Learning Methods for Prediction, Classification,     ***
***  Novelty Detection and Time Series Analysis           ***
***  Washington, D.C., May 2 -- 3, 1997                   ***
***  Geoffrey Hinton, University of Toronto               ***
***  Michael Jordan, Massachusetts Inst. of Tech.         ***
**************************************************************

A two-day intensive Tutorial on Advanced Learning Methods will be held on May 2nd and 3rd, 1997, at the Hyatt Regency on Capitol Hill, Washington D.C. Space is available for up to 50 participants for the course.

The course will provide an in-depth discussion of the large collection of new tools that have become available in recent years for developing autonomous learning systems and for aiding in the analysis of complex multivariate data. These tools include neural networks, hidden Markov models, belief networks, decision trees, memory-based methods, as well as increasingly sophisticated combinations of these architectures. Applications include prediction, classification, fault detection, time series analysis, diagnosis, optimization, system identification and control, exploratory data analysis and many other problems in statistics, machine learning and data mining.

The course will be devoted equally to the conceptual foundations of recent developments in machine learning and to the deployment of these tools in applied settings. Case studies will be described to show how learning systems can be developed in real-world settings. Architectures and algorithms will be presented in some detail, but with a minimum of mathematical formalism and with a focus on intuitive understanding. Emphasis will be placed on using machine learning methods as tools that can be combined to solve the problem at hand.

WHO SHOULD ATTEND THIS COURSE?

The course is intended for engineers, data analysts, scientists, managers and others who would like to understand the basic principles underlying learning systems. The focus will be on neural network models and related graphical models such as mixture models, hidden Markov models, Kalman filters and belief networks. No previous exposure to machine learning algorithms is necessary, although a degree in engineering or science (or equivalent experience) is desirable. Those attending can expect to gain an understanding of the current state of the art in machine learning and be in a position to make informed decisions about whether this technology is relevant to specific problems in their area of interest.
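As a concrete illustration of the kind of tool the course covers, a minimal sketch of fitting a two-component Gaussian mixture with the expectation-maximization (EM) algorithm is given below. It is not part of the course materials; it assumes Python with NumPy and uses synthetic data generated on the spot.

# Illustrative sketch only (not from the tutorial): EM for a
# two-component 1-D Gaussian mixture. Assumes NumPy; data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 300),   # samples from cluster 1
                    rng.normal(3.0, 1.5, 200)])   # samples from cluster 2

w = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])    # component means
var = np.array([1.0, 1.0])    # component variances

def normal_pdf(x, m, v):
    # Gaussian density with mean m and variance v, evaluated at x.
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

for _ in range(100):
    # E-step: responsibility of each component for each data point.
    dens = np.stack([w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # M-step: re-estimate weights, means and variances from the responsibilities.
    nk = resp.sum(axis=1)
    w = nk / x.size
    mu = (resp * x).sum(axis=1) / nk
    var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk

print("weights:", w, "means:", mu, "variances:", var)

The same alternation of expectation and maximization steps underlies the mixture-of-experts and hidden Markov model estimation listed in the course outline below.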
COURSE OUTLINE Overview of learning systems; LMS, perceptrons and support vectors; generalized linear models; multilayer networks; recurrent networks; weight decay, regularization and committees; optimization methods; active learning; applications to prediction, classification and control Graphical models: Markov random fields and Bayesian belief networks; junction trees and probabilistic message passing; calculating most probable configurations; Boltzmann machines; influence diagrams; structure learning algorithms; applications to diagnosis, density estimation, novelty detection and sensitivity analysis Clustering; mixture models; mixtures of experts models; the EM algorithm; decision trees; hidden Markov models; variations on hidden Markov models; applications to prediction, classification and time series modeling Subspace methods; mixtures of principal component modules; factor analysis and its relation to PCA; Kalman filtering; switching mixtures of Kalman filters; tree-structured Kalman filters; applications to novelty detection and system identification Approximate methods: sampling methods, variational methods; graphical models with sigmoid units and noisy-OR units; factorial HMMs; the Helmholtz machine; computationally efficient upper and lower bounds for graphical models REGISTRATION Standard Registration: $700 Student Registration: $400 Cancellation Policy: Cancellation before Friday April 25th, 1997, incurs a penalty of $150.00. Cancellation after Friday April 25th, 1997, incurs a penalty of one-half of Registration Fee. Registration Fee includes Course Materials, breakfast, coffee breaks, and lunch. On-site Registration is possible. Payment of on-site registration must be in US Dollar amounts, by Money Order or Check (preferably drawn on a US Bank account). Those interested in participating should return the completed Registration Form and Fee as soon as possible, as the total number of places is limited by the size of the venue. Please print this form, and fill in the hard copy to return by mail REGISTRATION FORM Learning Methods for Prediction, Classification, Novelty Detection and Time Series Analysis Friday, May 2 - Saturday, May 3, 1997 Washington, D.C., USA. -------------------------------------- Please complete this form (type or print) Name ___________________________________________________ Last First Middle Firm or Institution ______________________________________ Standard Registration ____ Student Registration ____ Mailing Address (for receipt) _________________________ __________________________________________________________ __________________________________________________________ __________________________________________________________ Country Phone FAX __________________________________________________________ email address (Lunch Menu - tick as appropriate): ___ Vegetarian ___ Non-Vegetarian Fee payment must be made by MONEY ORDER or PERSONAL CHECK. All amounts are given in US dollar figures. Make fee payable to Prof. Michael Jordan. Mail it, together with this completed Registration Form to: Professor Michael Jordan Dept. of Brain and Cognitive Sciences M.I.T. E10-034D 77 Massachusetts Avenue Cambridge, MA 02139 USA HOTEL ACCOMMODATION Hotel accomodation is the personal responsibility of each participant. The Tutorial will be held in Hyatt Regency on Capitol Hill 400 New Jersey Avenue, NW Washington, DC 20001 1-800-233-1234 or (202) 737-1234 on May 2 -- 3, 1997. The hotel has reserved a block of rooms for participants of the course. 
The special room rates for participants are: U.S. $139.00 (Single/Double) per night + tax You must reserve accommodation before *April 1, 1997* to avail of this special rate. Please be aware that these prices do not include State or City taxes. ADDITIONAL INFORMATION A registration form is available from the course's WWW page at http://www.ai.mit.edu/projects/cbcl/web-pis/jordan/course/ Marney Smyth E-mail: marney@ai.mit.edu Phone: 617 258-8928 Fax: 617 258-6779 From payman@u.washington.edu Mon Feb 24 23:55:17 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id XAA02835 for ; Mon, 24 Feb 1997 23:55:12 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id XAA26153 for ; Mon, 24 Feb 1997 23:55:10 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa06994; 25 Feb 97 0:18:04 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa06987; 24 Feb 97 23:59:50 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa09606; 24 Feb 97 23:59:03 EST Received: from EDRC.CMU.EDU by B.GP.CS.CMU.EDU id aa08843; 24 Feb 97 20:36:06 EST Received: from jason05.u.washington.edu by EDRC.CMU.EDU id aa26521; 24 Feb 97 20:35:49 EST Received: from saul7.u.washington.edu (saul7.u.washington.edu [140.142.82.2]) by jason05.u.washington.edu (8.8.4+UW96.12/8.8.4+UW96.12) with ESMTP id RAA14538 for ; Mon, 24 Feb 1997 17:32:44 -0800 Received: (from payman@localhost) by saul7.u.washington.edu (8.8.4+UW96.12/8.8.4+UW96.12) id RAA01799 for connectionists@cs.cmu.edu; Mon, 24 Feb 1997 17:35:45 -0800 (PST) Date: Mon, 24 Feb 1997 17:35:45 -0800 (PST) From: Payman Arabshahi Message-Id: <199702250135.RAA01799@saul7.u.washington.edu> To: connectionists@cs.cmu.edu Subject: CIFEr'97 Tutorials - New York, March 23, 1997 Computational Intelligence in Financial Engineering Conference CIFEr'97 March 23-25, 1997 Crowne Plaza Manhattan, New York City http://www.ieee.org/nnc/cifer97 Registration information: Barbara Klemm CIFEr'97 Secretariat Meeting Management 2603 Main Street, Suite # 690 Irvine, California 92714 Tel: (714) 752-8205 or (800) 321-6338 Fax: (714) 752-7444 Email: Meetingmgt@aol.com TUTORIALS ---------------------------------------------------------------------------- Risk Management Jan W. Dash, Ph.D. Director Quantitative Analysis Global Risk Management Smith Barney This tutorial will cover 1) characterization of risks in finance: market risk (interest rates, FX rates, equity indices, spreads), trading risk, systems risk (software, hardware, vendors), model risk, and 2) quantitative measurement of risk: the Greeks (Delta, Gamma, Vega), the partial Greeks (Ladders), the new Greeks (Exotics), dollars at risk (n-Sigma analysis), correlations, static scenario analysis, dynamic scenario analysis, Monte Carlo risk analysis, beginnings of risk standards, DPG, Risk Metrics, and 3) case study of risk: the Viacom CVR Options and 4) pricing and hedging for interest rate derivatives. ---------------------------------------------------------------------------- An Introduction to OTC Derivatives and Their Applications John F. Marshall, Ph.D. Executive Director International Association of Financial Engineers This tutorial is for persons with little prior exposure to derivative instruments. It will focus on the basic products, how they trade, and how they are used. It will be largely non-quantitative. 
The tutorial will examine how derivatives are used by financial engineers for risk management purposes, investment purposes, cash flow management, and creating structured securities. The use of derivatives to circumvent market imperfections, such as asymmetric taxes and transaction costs, will also be demonstrated. The primary emphasis of the tutorial will be swaps (including interest rate swaps, currency swaps, commodity swaps, equity swaps, and macroeconomic swaps). Applications of OTC options, including caps and floors and digital options will also be examined, but to a lesser extent. ---------------------------------------------------------------------------- GARCH Modeling of Financial Time Series R. Douglas Martin, Ph.D. Professor of Statistics, University of Washington Chief Scientist, Data Analysis Products Division of MathSoft, Inc. This tutorial provides an introduction to univariate and multivariate generalized autoregressive heteroscedastic (GARCH) modeling of financial returns time series data, with a focus on modeling conditional volatilities and correlations. Basic aspects of the various models are discussed, including: conditions for stationarity, optimization techniques for maximum likelihood estimation of the models, use of the estimated conditional standard deviations for value-at-risk calculations and options pricing, use of conditional correlations in obtaining conditional volatilities for portfolios. Examples are provided using the S+GARCH object-oriented toolkit for GARCH modeling. ---------------------------------------------------------------------------- Time Series Tools for Finance Andreas Wiegend, Ph.D. Professor, Stern School of Business, New York University This tutorial presents a unifying view of the recent advances of neuro-fuzzy, and other machine learning techniques for time series and finance. It is given jointly by Prof. Andreas Wiegend (Stern School of Business, NYU), and Dr. Georg Zimmerman (Siemens AG, Munich), and presents both conceptual aspects of time series modeling, specific tricks for financial engineering problems, and software engineering aspects for building a trading system. ---------------------------------------------------------------------------- An Introduction to Evolutionary Computation David B. Fogel, PhD Chief Scientist, Natural Selection, Inc., La Jolla Evolutionary computation encompasses a broad field of optimization algorithms that can be applied to diverse, difficult real-world problems. It is particularly useful in addressing stochastic, nonlinear, and time-varying optimization problems, including those arising in financial engineering. This tutorial will provide background on the inspiration, history, and the practical application of evolutionary computation to problems typical of those encountered in financial engineering. ---------------------------------------------------------------------------- Models for Stochastic Volatility: Some Recent Developments Nuno Cato Professor, New Jersey Institute of Technology, Newark Pedro J. F. de Lima Professor, The Johns Hopkins University, Baltimore In this tutorial, we will firstly discuss the importance of modeling stock market's volatility. Secondly, we will review the basic properties of GARCH- type and SV-type models and some of their most successful extensions, namely the SWitching ARCH (SWARCH) models. The performance of these models will be illustrated with some real data examples. 
Thirdly, we will discuss some problems with the estimation of these models and with their use for risk forecasting. Fourthly, we will describe some recent research and some novel extensions to these models, such as the Long-Memory Stochastic Volatility (LMSV) and the SWitching Stochastic Volatility (SWSV) models. By using examples from recent stock market behavior we illustrate the capabilities and shortcomings of these new modeling and forecasting tools. ---------------------------------------------------------------------------- From rheath@hiplab.newcastle.edu.au Tue Feb 25 06:41:37 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id GAA04999 for ; Tue, 25 Feb 1997 06:41:13 -0600 Received: from cs.uwa.oz.au (bilby.cs.uwa.oz.au [130.95.1.11]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id GAA00589 for ; Tue, 25 Feb 1997 06:41:04 -0600 (CST) Received: from (parma.cs.uwa.oz.au [130.95.1.7]) by cs.uwa.oz.au (8.6.8/8.5) with SMTP id RAA00945; Tue, 25 Feb 1997 17:33:11 +0800 Message-Id: <199702250933.RAA00945@cs.uwa.oz.au> From: Richard Heath To: USA CogSci , genetic-programming@cs.stanford.edu, lcq@spc.dcs.tsinghua.edu.cn, neuropl@plearn.edu.pl, neuropsych@mailbase.ac.uk, news-announce-conferences@uunet.uu.net, psicopatologia@vortex.ufrgs.br, psy-language@netcom.com, psycgrad@acadvm1.uottawa.ca, psyche-d@rfmh.org, psyqnga@herts.ac.uk, quantitative-mri@mailbase.ac.uk, reinforce@cs.uwa.edu.au, sdaviss@umabnet.ab.umd.edu, sigart@vaxa.isi.edu, slling-l@yalevm.ycc.yale.edu, socio-moral@vortex.ufrgs.br, uai@ghost.CS.ORST.EDU, vision-list@teleos.com Subject: Fourth Australasian Cognitive Science Conference Date: Tue, 25 Feb 1997 11:14:21 +1100 (EST) Please distribute widely to your colleagues. First Call for Papers Fourth Conference of the Australasian Cognitive Science Society September 26-28, 1997 University of Newcastle NSW AUSTRALIA You are cordially invited to attend the Fourth Biennial Conference of the Australasian Cognitive Science Society which will be held at the University of Newcastle, New South Wales, Australia from 26th - 28th September 1997. This conference involves multidisciplinary participation from a broad spectrum of researchers interested in cognitive processes and artificial intelligence from a variety of cognate disciplines, such as Psychology, Philosophy, Computer Science, Neuroscience, Engineering and Human Factors. The Conference emphasises the common interest among these disciplines towards advancing our knowledge of brain/mind function. The Conference will be held in the David Maddison Building of the Faculty of Medicine situated in downtown Newcastle, within five minutes walking distance from shops, restaurants and of course the lovely harbour and surfing beaches, for which Newcastle is famous. Newcastle is situated some 165 km (100 miles) north of Sydney, to which it is connected by freeway (about 2 hours drive), frequent train service and commuter air services. Several modern hotels/motels are located within easy walking distance from the Conference venue and delegates should consider extending their stay by visiting the nearby Hunter Valley vineyards and the picturesque tourist areas at Lake Macquarie and Port Stephens. The conference will consist of Invited speakers, including Professor Michael Arbib from the University of Southern California and Professor William Bechtel from Washington University in St Louis, applied symposia, oral presentations and poster sessions. 
Delegates can submit either a poster Abstract, or a longer submission for oral presentation. The latter should consist of a completed paper ready for publication in the Conference Proceedings. This paper must not exceed six pages in length, including references and figures and must be submitted in electronic form using any one of the popular word processing packages. The submission should be single spaced and contain just one column per page. Latex/Tex files should be converted to postscript and submitted in that form. Submitted papers should contain: (i) Title (ii) Author(s) and affiliation/address (iii) email address (iv) a short abstract containing no more than 150 words (v) no more than six pages of single-spaced single-column text (vi) tables and figures embedded within the text, not submitted separately. Each paper will be peer reviewed by two reviewers from different fields of Cognitive Science, in order to ensure high quality and originality as well as multidisciplinary relevance. Authors should avoid unnecessary jargon that will reduce the paper's value to researchers from other fields of Cognitive Science. All accepted papers and poster abstracts will be published on a CD-ROM which will be included in the package of information provided upon arrival. Authors can also include software, demonstrations and other hypermedia materials to support their paper. These material will be stored on the CD-ROM if sufficient space is available. Authors can select the font and page size that suits their writing style. The preferred font is Times-Roman 12pt. It is important that papers requiring revision be received by the Conference Technical Committee before the deadline for production of the CD-ROM. Hypermedia supplementary files should also be submitted by this deadline. Papers not resubmitted in time will only have their abstract published and will revert to poster presentations. Authors should submit their papers and/or poster abstracts in electronic form by email to: Associate Professor Richard Heath rheath@HIPLAB.NEWCASTLE.EDU.AU no later than 16th May 1997. Following review, all resubmitted papers will be due no later than 15th August, 1997. Further inquiries about the conference, including suggestions for Symposia and other aspects of the Conference organisation, should be directed to Professor Heath. Registration Fees (in Australian dollars, A$ = 0.77US$), which include morning and afternoon teas and the Conference CDROM, are: $155 (before 31st July 1997) $70 Students (before 31st July 1997) $175 (after 31st July 1997) $80 students (after 31st July 1997) The Conference Dinner will be held at the newly renovated and historic Customs House restaurant and will be held on Friday 26th September, 1997. The cost is $60 per person for Entree, Main course, Dessert and drinks. If you have any special dietary requests please indicate these on the Registration Form. Cheques/Money Orders payable to Cognitive Science Conference 1997 in Australian dollars should be sent to: Dr Andrew Heathcote Treasurer, Fourth Conference of the Australasian Cognitive Science Society Department of Psychology University of Newcastle University Drive CALLAGHAN NSW 2308 AUSTRALIA The Conference will be preceded by a full day Symposium on Dynamical Models of Mind, Their Nature, Relations to Computational and Other Models, and Future, to be held on 25th September 1997. This issue is currently 'hot' and with widespread ramifications. 
Whether the Symposium proceeds separately, or is simply incorporated into the conference meetings proper, will depend on the interest expressed. It is intended that the Symposium consist of invited speakers combined with general discussion of the issues. Expressions of interest can take the form of offers of papers (accepted papers, necessarily very limited, will appear in the Conference Proceedings, as above), offers of posters for a poster session, and offers to attend (simpliciter). Further information can be obtained from Professor Hooker at PLCAH@cc.newcastle.edu.au

The Conference Organising Committee are:
Associate Professor Richard Heath (Psychology)
Professor Cliff Hooker (Philosophy)
Dr Andrew Heathcote (Psychology)
Dr Brett Hayes (Psychology)
Associate Professor Graham Wrightson (Computer Science)
Dr Bruce Penfold (Electrical Engineering)
Dr Peter Pfister (Aviation/Psychology)

REGISTRATION FORM
Fourth Conference of the Australasian Cognitive Science Society
Name: ......................................
Title (Dr etc): ............................
Preferred Name for Badge: ..................
Mailing Address:
Telephone Number:
Fax Number:
Email address:
Please note that all correspondence, wherever possible, will be via email.
Cognitive Science Conference Registration (including Society Membership valid until 1999 and a copy of the Conference Proceedings on CD-ROM)
$155 (before 31st July 1997) ..................
$70 Students (before 31st July 1997) ..................
$175 (after 31st July 1997) ..................
$80 Students (after 31st July 1997) ..................
(Students should include a note on their University's letterhead indicating their status, in order to obtain the special Student rate)
Dinner ......... persons @ $60 = ....................
Dynamical Models of Mind Workshop
I would be interested in attending the above Workshop on 25th September 1997 with Registration costs as follows: YES NO
$50 (before 31st July 1997) ..................
$30 Students (before 31st July 1997) ..................
$60 (after 31st July 1997) ..................
$35 Students (after 31st July 1997) ..................
If YES, indicate whether you would wish to present a Paper or Poster and send an abstract of 150 words for the Poster or a paper using the same format as for the Cognitive Science Conference.
Paper ........... Poster ..............
TOTAL ....................
Please send a cheque for the above TOTAL amount payable to:
Dr Andrew Heathcote
Treasurer, Fourth Conference of the Australasian Cognitive Science Society
Department of Psychology
University of Newcastle, University Drive
CALLAGHAN NSW 2308 AUSTRALIA
If you are an overseas delegate, please send a bank cheque drawn on an Australian bank for the above TOTAL amount in Australian dollars.
Do you have any special dietary requirements for the Conference Dinner? Yes No
If YES, please indicate what these are:
Accommodation Requirements (Please rank in order of preference):
Noahs on the Beach Standard $94.50 Single/Double ($10 extra person) ..............
Ocean View $112.50 Single/Double ..............
Radisson Hotel $119 with breakfast Single ..............
$129 with breakfast Twin/Double ..............
Novocastrian $95 up to $145 for a suite ..............
Other hotels and motels are available away from the beach area at reduced rates, approx $50 - $80 We will also endeavour to cater for students and others on limited budgets, if you can let us know your requirements in the space below. Please return this form by email to rheath@hiplab.newcastle.edu.au A World-Wide-Web page is under construction and a more detailed description of the Conference and Workshop will be presented there. From tlindroo@bennet.dc.tutech.fi Tue Feb 25 11:41:30 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id LAA11692 for ; Tue, 25 Feb 1997 11:41:25 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id LAA06764 for ; Tue, 25 Feb 1997 11:41:19 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa07791; 25 Feb 97 10:28:48 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa07789; 25 Feb 97 10:17:17 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa10110; 25 Feb 97 10:17:05 EST Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa14850; 25 Feb 97 4:30:09 EST Received: from bennet.dc.tutech.fi by CS.CMU.EDU id aa09860; 25 Feb 97 4:29:17 EST Received: (from tlindroo@localhost) by bennet.dc.tutech.fi (8.8.4/8.8.4) id LAA35528 for connectionists@cs.cmu.edu; Tue, 25 Feb 1997 11:31:38 +0200 Date: Tue, 25 Feb 1997 11:31:38 +0200 From: Tommi Lindroos Message-Id: <199702250931.LAA35528@bennet.dc.tutech.fi> To: connectionists@cs.cmu.edu Subject: EANN '97: Invitation for participation Sorry for this unsolicited mail. It is being sent to you because you are apparently working in or interested in the field of neural networks. Most of these addresses are taken from neural network societies, and participants of earlier EANN conferences. Please let us know if your address should not be in here. International Conference on Engineering Applications of Neural Networks (EANN '97) Royal Institute of Technology Stockholm, Sweden 16-18 June 1997 INVITATION FOR PARTICIPATION AND PROGRAM OUTLINE The International Conference on Engineering Applications of Neural Networks (EANN '97) is the third conference in the series. The two earlier ones were held near Helsinki in 1995 and in London in 1996. The conference is a forum for presenting the latest results on neural network applications in technical fields. Over a hundred papers from 27 countries have been accepted for oral and poster presentations after a review of the abstracts. Some more information on the conference EANN '97 is available on the world wide web site at http://www.abo.fi/~abulsari/EANN97.html, and on last two conferences at http://www.abo.fi/~abulsari/EANN95.html and EANN96.html Contact details E-mail address : eann97@kth.se Address : EANN '97 SEA PL 953, FIN 20101 Turku 10, Finland The conference has been organised with cooperation from Systems Engineering Association AB Nonlinear Solutions OY and Royal Institute of Technology Conference chairmen: Hans Liljenstrm and Abhay Bulsari Conference venue: Department of Numerical Analysis (NADA), Royal Institute of Technology (KTH), Osquars backe 2, 10044 Stockholm Registration information The conference fee will be SEK 4148 (SEK 3400 excluding VAT) until 28 February, and SEK 4978 (SEK 4080 excluding VAT) after that. The conference fee includes attendance to the conference and the proceedings. 
If your organisation (university or company or institute) has a VAT registration from a European Union country other than Finland, then your VAT number should be mentioned on the bank transfer as well as the registration form, and VAT need not be added to the conference fee. The correct conference fee amount should be received in the account number 207 799 342, Svenska Handelsbanken International, Stockholm branch. It can be paid by bank transfer, with all expenses paid by the sender, to "EANN Conference". To avoid extra bureaucracy and correction of the amount at the registration desk, make sure that you have taken care of the bank transfer fees. It is essential to mention the name of the participant with the bank transfer. If you need to pay it in another way (bank drafts, Eurocheques, postal order; no credit cards), please contact us at eann97@kth.se. Invoicing will cost SEK 100. The tentative program outline is as on the following page. The detailed program will be prepared in the end of April. PROGRAM OUTLINE Sunday, 15 June 1600-1800 Registration Room E1 Room E2 Monday, 16 June 0830 Opening 0845 Vision (1) Control Systems (1) 1200 --- lunch break --- 1330 Vision (2) Control Systems (2) 1630 Discussion session Discussion session on Control on Vision Tuesday, 17 June 0830 Process Engineering Biomedical Engineering 1130 --- lunch break --- 1300 Metallurgy Mechanical Engineering 1530 Industrial Panel Discussion Wednesday, 18 June 0830 Hybrid systems Special applications 1130 --- lunch break --- 1300 Electrical/Electronics General Applications 1800 Closing In addition, there will be a poster session on Wednesday, and possibly an evening program on Tuesday. The indicated times are approximate and changes are still possible. Coffee breaks are not indicated here. International Conference on Engineering Applications of Neural Networks (EANN '97) Stockholm, Sweden 16-18 June 1997 Registration form Surname First Name Affiliation (name of the university/company/organisation) E-mail address Postal Address City Country Fax Have you submitted one or more abstracts ? Y/N Abstract number(s) Registration fee sent (amount) SEK ____________ by bank transfer number _______________ from Bank ______________ VAT registration number Date registration fee sent Date registration form sent Any special requirements ? 
--- END --- From karaali@ukraine.corp.mot.com Tue Feb 25 18:19:13 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id SAA20427 for ; Tue, 25 Feb 1997 18:19:08 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id SAA16131 for ; Tue, 25 Feb 1997 18:19:01 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa08189; 25 Feb 97 16:00:17 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa08187; 25 Feb 97 15:46:29 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa10332; 25 Feb 97 15:45:54 EST Received: from RI.CMU.EDU by B.GP.CS.CMU.EDU id aa24768; 25 Feb 97 15:27:15 EST Received: from motgate.mot.com by RI.CMU.EDU id aa10273; 25 Feb 97 15:24:54 EST Received: from pobox.mot.com (pobox.mot.com [129.188.137.100]) by motgate.mot.com (8.7.6/8.6.10/MOT-3.8) with ESMTP id OAA14595 for ; Tue, 25 Feb 1997 14:24:51 -0600 (CST) Received: from ukraine.corp.mot.com (ukraine.ccrl.mot.com [182.1.83.20]) by pobox.mot.com (8.7.6/8.6.10/MOT-3.8) with SMTP id OAA02835 for ; Tue, 25 Feb 1997 14:24:48 -0600 (CST) Received: from fiji.mot.com by ukraine.corp.mot.com (4.1/SMI-4.1) id AA15668; Tue, 25 Feb 97 14:24:45 CST Received: by fiji.mot.com (SMI-8.6/SMI-SVR4) id OAA18339; Tue, 25 Feb 1997 14:24:44 -0600 Date: Tue, 25 Feb 1997 14:24:44 -0600 From: Orhan Karaali Message-Id: <199702252024.OAA18339@fiji.mot.com> To: Connectionists@cs.cmu.edu Subject: Neural Network Intern For Speech Recognition X-Sun-Charset: US-ASCII NEURAL NETWORK INTERN FOR SPEECH RECOGNITION Motorola's Chicago Corporate Research Laboratories is currently seeking a Ph.D. student to join the Speech Synthesis and Machine Learning Group as an intern in its Speech Processing Systems Research Laboratory in Schaumburg, Illinois, for spring 1997. The internship will last at least three months. The Speech Synthesis and Machine Learning Group has developed innovative neural network and signal processing technologies for speech synthesis and speech recognition applications. The intern will work on a component of an HMM/neural network hybrid speech recognizer. The duties of the position include applied research, software development, and conducting experiments with speech data sets. Innovation in research, application of technology and a high level of motivation is the standard for all members of the team. The individual should be in an advanced stage of a Ph.D. program in EE, CS or a related discipline. The ability to work within a group to quickly implement and evaluate algorithms in a rapid research/development cycle is essential. Strong programming skills in the "C" language and solid knowledge of neural networks are required. Background in any of speech processing, statistical techniques, decision trees, and genetic algorithms is highly desirable. Please send text-format resume and cover letter to Orhan Karaali, karaali@mot.com. Motorola is an equal opportunity/affirmative action employer. We welcome and encourage diversity in our workforce. 
From langley@mail.RTNA.DaimlerBenz.COM Wed Feb 26 03:14:51 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id DAA03967 for ; Wed, 26 Feb 1997 03:14:28 -0600 Received: from cs.uwa.oz.au (bilby.cs.uwa.oz.au [130.95.1.11]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id DAA27693 for ; Wed, 26 Feb 1997 03:14:26 -0600 (CST) Received: from (parma.cs.uwa.oz.au [130.95.1.7]) by cs.uwa.oz.au (8.6.8/8.5) with SMTP id PAA03532; Wed, 26 Feb 1997 15:33:03 +0800 Message-Id: <199702260733.PAA03532@cs.uwa.oz.au> From: Pat Langley To: cogling@ucsd.edu, psyling@psy.gla.ac.uk, childes@cmu.edu, mpsych-l@brownvm.brown.edu, neuron-request@cattell.psych.upenn.edu, neuropl@plearn.edu.pl, neuropsych-request@mailbase.ac.uk, news-announce-conferences@uunet.uu.net, psicopatologia@vortex.ufrgs.br, psy-language@netcom.com.psyche-d@rfmh.org, psyc@pucc.princeton.edu, psycgrad@acadvm1.uottawa.ca, frieze@vms.cis.pitt.edu, quantitative-mri@mailbase.ac.uk, reinforce@cs.uwa.edu.au, sdaviss@umabnet.ab.umd.edu, sigart@vaxa.isi.edu, slling-l@yalevm.ycc.yale.edu, socio-moral@vortex.ufrgs.br, uai@ghost.CS.ORST.EDU, vision-list@teleos.com, lcq@spc.dcs.tsinghua.edu.cn, psychology.of.science@umich.edu, psyqnga@herts.ac.uk, brennavm@ctrvax.vanderbilt.edu Subject: abstract deadline for CogSci97 Date: Tue, 25 Feb 1997 12:30:16 -0800 (PST) Although the deadline for full-paper submissions to the 1997 Cognitive Science Conference is now past, you still have until Tuesday, March 4, to deliver a hardcopy one-page abstract. These submissions must follow the instructions for authors available at the conference web page: http://www-csli.stanford.edu/cogsci97 Authors of abstracts are guaranteed one page in the proceedings and a poster at the conference, but on one condition: the presenter must be a member of the Cognitive Science Society or they must join when the send in their final, camera-ready copy on April 29. More information about the Society is available at: http://www.pitt.edu/~cogsci95 If you have already sent us a six-page paper, you do not need to submit a separate abstract. Even if your paper is not accepted, you will still have a chance to contribute an abstract to the proceedings. If you have any questions about abstracts or about the conference, please send email to cogsci97@csli.stanford.edu. 
From ejua71@tattoo.ed.ac.uk Wed Feb 26 15:46:36 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id PAA22753 for ; Wed, 26 Feb 1997 15:46:07 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id PAA13246 for ; Wed, 26 Feb 1997 15:46:02 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa09577; 26 Feb 97 15:29:28 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa09574; 26 Feb 97 15:00:40 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa11044; 26 Feb 97 15:00:19 EST Received: from EDRC.CMU.EDU by B.GP.CS.CMU.EDU id aa12830; 26 Feb 97 12:58:52 EST Received: from renko.ucs.ed.ac.uk by EDRC.CMU.EDU id aa09039; 26 Feb 97 12:58:10 EST Received: from tattoo.ed.ac.uk (tattoo.ed.ac.uk [129.215.166.10]) by renko.ucs.ed.ac.uk (8.6.13/8.6.12) with SMTP id RAA11564 for ; Wed, 26 Feb 1997 17:58:06 GMT From: J A Bullinaria Subject: Conference announcement: NCPW4 To: Connectionists@cs.cmu.edu Cc: j.bullinaria@psyc.bbk.ac.uk, ncpw4@psychol.ucl.ac.uk Date: Wed, 26 Feb 97 17:57:35 GMT Message-ID: <9702261757.ab24625@uk.ac.ed.tattoo> 4th Neural Computation and Psychology Workshop Connectionist Representations : Theory and Practice University of London, England Wednesday 9th April - Friday 11th April 1997 We have recently solicited, recieved, reviewed and accepted 36 abstracts for presentation as talks at this workshop. We now invite additional participants to register, attend and listen to these talks. AIMS AND OBJECTIVES This workshop is the fourth in a series, following on from the first at the University of Wales, Bangor (with theme "Neurodynamics and Psychology"), the second at the University of Edinburgh, Scotland ("Memory and Language") and the third at the University of Stirling, Scotland ("Perception"). The general aim is to bring together researchers from such diverse disciplines as artificial intelligence, applied mathematics, cognitive science, computer science, neurobiology, philosophy and psychology to discuss their work on the connectionist modelling of psychology. This years workshop is being hosted jointly by members of the Psychology Departments of Birkbeck College London and University College London. As in previous years there will be a theme to the workshop. We think that this years theme is sufficiently wide ranging and important that researchers in all areas of Neural Computation and Psychology will find it relevant and have something to say on the subject. The theme is to be: "Connectionist Representations : Theory and Practice". This covers many important issues ranging from the philosophical (such as the grounding problem) to the physiological (what can connectionist representations tell us about real neural systems) to the technical (such as what is necessary to get specific models to work). As in previous years we aim to keep the workshop fairly small, informal and single track. As always, participants bringing expertise from outside the UK are particularly welcome. SPEAKERS INCLUDE: Roland Baddeley (Oxford) Dennis Norris (APU Cambridge) Tony Browne (Mid Kent) Mike Page (APU Cambridge) Neil Burgess (UCL) Malti Patel (Sydney, Australia) Morten Christiansen (S. Calif.) Tim Shallice (UCL) Robert French (Liege, Belgium) Leslie Smith (Stirling) Peter Hancock (Stirling) John G. 
Taylor (King's London) Glyn Humphreys (Birmingham) Chris Thornton (Sussex) Geoff Goodhill (Georgetown, DC) Janet Vousden (Warwick) Our web site has a complete listing of all the talks and abstracts. REGISTRATION, FOOD AND ACCOMMODATION The workshop will be held in University College London, which is situated in the centre of London, near the British Museum and within easy walking distance of the West End and many of London's major attractions. The conference registration fee (which includes lunch and morning and afternoon tea/coffee each day) is 60 pounds. A special conference dinner (optional) is planned for the Thursday evening costing 20 pounds. Accommodation can be arranged in student residences or in local hotels, according to budget. The conference/accommodation area is easily accessible by the London Underground system ("The Tube"), with direct lines from London Heathrow Airport and all the major intercity train stations. Full registration and accommodation information is available at the conference web site: "http://prospero.psychol.ucl.ac.uk/ncpw4/". ORGANISING COMMITTEE John Bullinaria (Birkbeck College London) Dave Glasspool (University College London) George Houghton (University College London) CONTACT DETAILS Workshop email address for all correspondence: ncpw4@psychol.ucl.ac.uk Workshop web page: http://prospero.psychol.ucl.ac.uk/ncpw4/ John Bullinaria, NCPW4, Centre for Speech and Language, Department of Psychology, Birkbeck College, Malet Street, London WC1E 7HX, UK. Phone: +44 171 631 6330, Fax: +44 171 631 6587 Email: j.bullinaria@psyc.bbk.ac.uk Dave Glasspool, NCPW4, Department of Psychology, University College London, Gower Street, London WC1E 6BT, UK. Phone: +44 171 380 7777 Xtn. 5418. Fax: +44 171 436 4276 Email: d.glasspool@psychol.ucl.ac.uk George Houghton, NCPW4, Department of Psychology, University College London, Gower Street, London WC1E 6BT, UK. Phone: +44 171 380 7777 Xtn. 5394. Fax: +44 171 436 4276 Email: g.houghton@psychol.ucl.ac.uk From koza@cs.stanford.edu Thu Feb 27 00:23:03 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id AAA03117 for ; Thu, 27 Feb 1997 00:22:58 -0600 Received: from cs.uwa.oz.au (bilby.cs.uwa.oz.au [130.95.1.11]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id AAA23688 for ; Thu, 27 Feb 1997 00:22:55 -0600 (CST) Received: from (parma.cs.uwa.oz.au [130.95.1.7]) by cs.uwa.oz.au (8.6.8/8.5) with SMTP id NAA22951; Thu, 27 Feb 1997 13:21:27 +0800 Message-Id: <199702270521.NAA22951@cs.uwa.oz.au> From: "John R. Koza" To: Reinforce@cs.uwa.edu.au Subject: GP-97 Late-Breaking Papers Call Date: Wed, 26 Feb 1997 20:01:55 -0800 (PST) CALL (Version 1.0) FOR LATE-BREAKING PAPERS FOR GENETIC PROGRAMMING 1997 CONFERENCE (GP-97) ------------------------------------------------- DEADLINE: Wednesday, June 11, 1997 ------------------------------------------------- Papers describing late-breaking developments in the field of genetic programming are being solicited for inclusion in a special paper-bound book to be distributed to all attendees of the Genetic Programming 1997 Conference (GP-97) to be held on July 13 - 16 (Sunday - Wednesday), 1997 at Stanford University. This special book is distinct from the conference proceedings. The purpose of late-breaking papers is to provide conference attendees with information about research that was initiated, enhanced, improved, or completed after the original paper submission deadline in January, 1997. 
Late-breaking papers will be presented during a poster session to be held on the evening of Monday, July 14, 1997 during the GP-97 conference at Stanford University. Arrangements will also be made to enable interested persons to purchase this book from the Stanford Book Store after the conference. Late-breaking papers will be briefly examined for relevance and minimum standards of acceptability, but will not be peer reviewed in detail. Authors will individually retain copyright (and all other rights) to their late-breaking papers and should feel free to submit them (either before or after the above deadline) for publication by other conferences or journals.

Late-breaking papers must be submitted in camera-ready form in accordance with the GP-97 format specifications that can be found at the GP-97 WWW site (see below). Late-breaking papers should be no more than 9 pages in length. Please send TWO camera-ready copies (printed with very high quality by laser printer) and the SIGNED "permission to publish" form (below) to
GP-97 Late-Breaking Papers
American Association for Artificial Intelligence
445 Burgess Drive
Menlo Park, CA 94025, USA
PHONE: 415-328-3123
------------------------------------------------------
For additional information on the GP-97 conference...
- on the World Wide Web: http://www-cs-faculty.stanford.edu/~koza/gp97.html
- via e-mail at gp@aaai.org
------------------------------------------------------
In cooperation with American Association for Artificial Intelligence (AAAI), Association for Computing Machinery (ACM), SIGART, and Society for Industrial and Applied Mathematics (SIAM)
------------------------------------------------------
PERMISSION TO PUBLISH FORM
For Late-Breaking Papers at the GP-97 Conference

Title of Paper: ____________________________
Author(s): _______________________________

The undersigned (hereinafter the "Author"), desiring that the paper identified above (hereinafter the "Paper") appear in a publication tentatively entitled "Late-Breaking Papers at the Genetic Programming 1997 Conference" and to be edited by John R. Koza, hereby grants non-exclusive permission to Genetic Programming Conferences Inc., a California non-profit corporation (hereinafter "GPCI"), to prepare and print the Paper, for sale throughout the world, in this publication. The Author retains copyright, the right to transfer the copyright to other parties in the future, the right to use any and all portions of the Paper in future publications by the Author, all proprietary rights (patent rights, etc.), and all other rights. The Author assigns copyright to GPCI for publication of the Paper. The Author warrants that he/she is the author and/or proprietor of the Paper; that he/she has full power to make this agreement; that the Paper does not infringe upon any copyright, trademark, or patent; and that he/she has not granted or assigned any rights on the Paper to any person or entity that would interfere with this grant of permission.
Authorized Signature: _____________________ Printed Name of Signer: ___________________ Date: _______________ Address: _______________________________ _______________________________________ _______________________________________ From amilcar@dei.uc.pt Thu Feb 27 00:23:37 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id AAA03121 for ; Thu, 27 Feb 1997 00:23:32 -0600 Received: from cs.uwa.oz.au (bilby.cs.uwa.oz.au [130.95.1.11]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id AAA23697 for ; Thu, 27 Feb 1997 00:23:25 -0600 (CST) Received: from (parma.cs.uwa.oz.au [130.95.1.7]) by cs.uwa.oz.au (8.6.8/8.5) with SMTP id NAA22940; Thu, 27 Feb 1997 13:17:57 +0800 Message-Id: <199702270517.NAA22940@cs.uwa.oz.au> From: amilcar@dei.uc.pt (Amilcar Cardoso) To: addistr@sis.port.ac.uk, Afcet.Administration@ibp.fr, ai-stats@watstat.uwaterloo.ca, alife@cognet.ucla.edu, annrules@fit.qut.edu.au, cogpsy@neuro.psy.soton.ac.uk, colt@cs.uiuc.edu (COLT list), connectionists@cs.cmu.edu, dietmar@cognition.iig.uni-freiburg.de (European CBR Newsletter), genetic-programming@cs.stanford.edu, hybrid-list@cs.ua.edu, ml@ics.uci.edu, mlnet@csd.abdn.ac.uk (MLnet Admin), nat@ura1507.univ-paris13.fr, neuron-request@CATTELL.PSYCH.UPENN.EDU, nl-kr@snyside.sunnyside.com (comp.ai.nlang-know-rep), nonlin-l@list.nih.gov, reinforce@cs.uwa.edu.au (Reinforcement List), scivw-request@hitl.washington.edu (sci.virtual-worlds), www@sigart.acm.org (SIGART), webmaster@ai.iit.nrc.ca Cc: ernesto@dei.uc.pt Subject: EPIA'97 - 2nd CfP [connectionists] Date: Wed, 26 Feb 1997 19:03:53 GMT This CfP was sent to several mailing-lists. Please accept my apologies if you receive multiple copies. Amilcar Cardoso ---------------------------------------------------------------- Please pay atention: deadline for submissions is March, 17! ---------------------------------------------------------------- 2nd. CALL FOR PAPERS EPIA'97 8th Portuguese Conference on Artificial Intelligence Coimbra, Portugal October 6-9, 1997 http://alma.uc.pt/~epia97/ Under the auspices of the Portuguese Association for Artificial Intelligence The EPIA'97 Program Committee invites submissions of technical papers for the 8th Portuguese Conference on Artificial Intelligence which is to be held in Coimbra, Portugal, by October 6-9, 1997. As in previous issues ('89, '91, '93 and '95), EPIA'97 will be run as an international conference, English being the official language. The scientific program encompasses tutorials, invited lectures, parallel workshops, and paper presentations. Eight well-known researchers will present invited lectures and tutorials. The conference is devoted to all areas of Artificial Intelligence and will cover both theoretical and foundational issues and applications as well. 
************************************************************** INVITED LECTURES Tom Mitchell Oskar Dressler (CMU - USA) (OCC'M Software GmbH - Germany) Francisco Varela Luis Moniz Pereira (CNRS - France) (UNL - Portugal) ************************************************************** TUTORIAL PROGRAM Daniel O'Leary (USC - USA): AI and Finance Ramon de Mantaras (CSIC - Spain): KDD and Data Mining Pedro Barahona (UNL - Portugal): Constraint Programming Felix Costa (UL - Portugal): Neurocomputing ************************************************************** TIMETABLE Submission Deadline: March 17, 1997 Notification of Acceptance or Rejection: May 19, 1997 Camera-Ready Copy: June 16, 1997 CONTENT AREAS Original papers are solicited in all areas of Artificial Intelligence, including but not limited to: o Agent-Oriented Programming o Automated Reasoning o Artificial Life o Belief Revision o Case-Based Reasoning o Common Sense Reasoning o Constraint Programming o Distributed AI o Expert Systems o Genetic Algorithms o Hybrid Systems o Intelligent Tutoring Systems o Knowledge Representation o Logic Programming o Machine Learning o Model-Based Reasoning o Natural Language Understanding o Neural Networks o Nonmonotonic Reasoning o Planning and Scheduling o Qualitative Reasoning o Robotics o Spatial Reasoning o Temporal Reasoning o Theorem Proving o Theory of Computation PROCEEDINGS As with previous conferences, proceedings will be published by Springer-Verlag in their Lecture Notes in Artificial Intelligence Series and will be available to participants. SUBMISSIONS Authors must submit five (5) completed printed copies of their papers to the EPIA'97 Submission Address (see below). Fax or electronic submissions will not be accepted. Detailed instructions on the procedure and on the format are available at the EPIA'97 Web site. ELECTRONIC ABSTRACT In addition to submitting the paper copies, authors should send a short (200 words) electronic abstract of their paper to aid the reviewing process. Detailed instructions on the procedure and on the format are available at the EPIA'97 Web site. REVIEW OF PAPERS Submissions will be judged on significance, originality, quality and clarity. Each paper will be cross-reviewed by three referees. Papers will be subject to blind peer review: reviewers will not be aware of the identities of the authors. This requires that authors exercise some care not to identify themselves in their papers. Detailed instructions are available at the EPIA'97 Web site. Submitted papers must report original and previously unpublished work. MAILING LIST We have set up an automated mailing list facility to easily distribute up to date information to those who wish to submit papers and/or attend the conference. You may find instructions on how to add yourself to the mailing list at the EPIA'97 Web site. PARALLEL WORKSHOPS Three workshops will run in parallel with the main stream of the conference: EKBD'97 - Extraction of Knowledge from Data Bases MASTA'97 - Multi-Agent Systems: Theory and Applications OR'97 - Outdoor Robotics Details about submissions and attendance are available at the EPIA'97 Web site. 
CONFERENCE CO-CHAIRS, PROGRAM CO-CHAIRS Ernesto Costa (ernesto@dei.uc.pt) Amilcar Cardoso (amilcar@dei.uc.pt) Universidade de Coimbra - Portugal LOCAL CHAIR Jose Luis Ferreira (jlf@dei.uc.pt) Universidade de Coimbra - Portugal PROGRAM COMMITTEE Bernardete Ribeiro (Portugal) Carlos Bento (Portugal) Carlos Pinto-Ferreira (Portugal) Cristiano Castelfranchi (Italy) Ernesto Morgado (Portugal) Eugenio Oliveira (Portugal) Gabriel Pereira Lopes (Portugal) Helder Araujo (Portugal) Helder Coelho (Portugal) John Self (UK) Larry Medsker (USA) Luis Moniz Pereira (Portugal) Luis Monteiro (Portugal) Manuela Veloso (USA) Miguel Filgueiras (Portugal) Nuno Mamede (Portugal) Oskar Dressler (Germany) Pavel Brazdil (Portugal) Pedro Barahona (Portugal) Philippe Dague (France) Ramon de Mantaras (Spain) Rosa Vicari (Brazil) Stefano Nolfi (Italy) Stuart Shapiro (USA) Takeo Kanade (USA) Xue Mei Wang (USA) Yves Kodratoff (France) SUBMISSION ADDRESS INQUIRES ADDRESS EPIA'97 Dep. Eng. Informatica Universidade de Coimbra - Polo II Pinhal de Marrocos 3030 Coimbra, Portugal Voice: +351 (39) 7000004 Fax: +351 (39) 701266 Email: epia97@alma.uc.pt (for inquires) epia_submissions@dei.uc.pt (for submission of the electronic abstract) URL: http://alma.uc.pt/~epia97/ OFFICIAL LANGUAGE English ************************************************************** WE ENVITE YOU TO VISIT THE EPIA'97 WEB SITE AT http://alma.uc.pt/~epia97 TO FIND ADDITIONAL INFORMATION ON THE CONFERENCE AND ON COIMBRA AND ITS HISTORICAL UNIVERSITY. ************************************************************** --------------------------------------------------------------------------- Dep. Eng. Informatica tel: + 351 39 7000 000 Universidade de Coimbra, Polo II fax: + 351 39 701 266 Pinhal de Marrocos email: amilcar@dei.uc.pt 3030 Coimbra - PORTUGAL From bishopc@helios.aston.ac.uk Thu Feb 27 19:28:18 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id TAA21030 for ; Thu, 27 Feb 1997 19:28:13 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id TAA14550 for ; Thu, 27 Feb 1997 19:28:11 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa11220; 27 Feb 97 18:57:11 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa11218; 27 Feb 97 18:32:31 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa11958; 27 Feb 97 18:32:07 EST Received: from RI.CMU.EDU by B.GP.CS.CMU.EDU id aa28152; 27 Feb 97 6:15:52 EST Received: from email.aston.ac.uk by RI.CMU.EDU id aa21876; 27 Feb 97 6:15:02 EST Received: from sun.aston.ac.uk (actually host fermat.aston.ac.uk) by email.aston.ac.uk with SMTP (PP); Thu, 27 Feb 1997 11:15:31 +0000 Message-Id: <3914.199702271009@sun.aston.ac.uk> X-Mailer: exmh version 2.0alpha 12/3/96 To: Connectionists@cs.cmu.edu cc: bishopc@helios.aston.ac.uk Subject: Isaac Newton Institute Mime-Version: 1.0 Content-Type: text/plain; charset=us-ascii Date: Thu, 27 Feb 1997 10:09:52 +0000 From: "Prof. Chris Bishop" Isaac Newton Institute NEURAL NETWORKS AND MACHINE LEARNING A six month programme at the Isaac Newton Institute for Mathematical Sciences, Cambridge, U.K. 
July to December 1997 Organisers: C M Bishop (Aston), D Haussler (UCSC), G E Hinton (Toronto), M Niranjan (Cambridge), L G Valiant (Harvard) The Isaac Newton Institute for Mathematical Sciences is an international research centre, sponsoring visitor programmes in topics across the whole spectrum of mathematical research. This programme will be an important international event in the field of neural computing. At any one time there will be around 20 to 25 long-term participants, as well as larger numbers of short-term visitors. Seminars are open to all, although long-term participation will be by invitation. Three major conferences are planned to take place during the programme: 1) "Generalization in Neural Networks and Machine Learning" (4 to 15 August). This will be a NATO Advanced Study Institute. 2) "Probabilistic Graphical Models" (1 to 5 September). 3) "Bayesian Methods" (15 to 19 December). In addition, there will be a number of workshops, generally of one week duration. Provisional themes for these include the following: "Learning in computer vision" (6 to 10 October), "HMM hybrid models and protein/DNA modelling" (20 to 24 October), "Applications" (3 to 7 November), "Non-stationarity" (10 to 14 November), "Information geometry" (8 to 12 December). Information about the programme, and a list of participants, can be found at the programme web site: http://www.newton.cam.ac.uk/programs/nnm.html General information about the Newton Institute can be found at: http://www.newton.cam.ac.uk/ To be kept informed of future developments, you can subscribe to the programme information mailing list by sending an e-mail to majordomo@newton.cam.ac.uk with a message whose body contains the line subscribe nnm-list For programme-specific enquires please contact: Christopher M. Bishop Neural Computing Research Group Dept. of Computer Science and Applied Mathematics Aston University, Birmingham B4 7ET, U.K. Tel. +44/0 121 333 4631 Fax. 
+44/0 121 333 4586 C.M.Bishop@aston.ac.uk From mel@quake.usc.edu Fri Feb 28 00:23:27 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id AAA25341 for ; Fri, 28 Feb 1997 00:23:17 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id AAA19203 for ; Fri, 28 Feb 1997 00:23:16 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa11232; 27 Feb 97 19:06:25 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa11222; 27 Feb 97 18:33:21 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa11970; 27 Feb 97 18:33:01 EST Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa07164; 27 Feb 97 17:25:26 EST Received: from quake.usc.edu by CS.CMU.EDU id aa08774; 27 Feb 97 17:24:41 EST Received: by quake.usc.edu (5.0/8.6.4) id AA00232 for connectionists@cs.cmu.edu; Thu, 27 Feb 1997 14:20:18 +0800 Date: Thu, 27 Feb 1997 14:20:18 +0800 From: Bartlett Mel Message-Id: <9702272220.AA00232@quake.usc.edu> To: connectionists@cs.cmu.edu, incmembers@cogsci.uscd.edu, inc@ucsd.edu, cns-interest@cns.caltech.edu, bme-all@quake.usc.edu, stud.scholar.members@cogsci.ucsd.edu Subject: UCSD/USC Joint Symposium on Neural Computation CALL FOR PRESENTATIONS --- 4th Annual Joint Symposium on Neural Computation --- Co-sponsored by Institute for Neural Computation University of California, San Diego and Biomedical Engineering Department and Neuroscience Program University of Southern California to be hosted at The University of Southern California University Park Campus Saturday, May 17, 1997 9:00 a.m. to 6:00 p.m. In 1994, the Institute for Neural Computation at UCSD hosted the first Joint Symposium on Neural Computation with Caltech that brought together students and faculty for a day of short presentations. This year USC will be the site for the fourth Symposium and will feature as keynote speaker: Prof. Irving Biederman Departments of Psychology and Computer Science and the Neuroscience Program University of Southern California "Shape Representation in Mind and Brain" Submissions will be open to members of the Computational Neuroscience community of Southern California. Given the larger constituency than in previous years, authors are invited to contribute 300 word abstracts, which will be reviewed by a program committee consisting of the USC organizers and representatives of the INC. The contributed program will consist of 15 minute oral presentations, and posters. Abstracts selected for presentation will be included in the final program. DEADLINE FOR RECEIPT OF ABSTRACTS: March 28, 1997 Submissions should be e-mailed or mailed to Linda Yokote using the form below. Notification of acceptance as oral or poster presentation will be e-mailed to authors by April 18, 1997. A proceedings of short papers will be published by the INC. Contributions to the proceedings, based on both oral and poster presentations, must be submitted no later than May 30, 1997 for timely publication to: Institute for Neural Computation, University of California San Diego, 9500 Gilman Drive DEPT 0523, La Jolla, California 92093-0523. As in previous years, authors will retain copyright to their papers, so that they may be resubmitted elsewhere. Registration and attendance at the Symposium is open to the public. USC Organizing Committee: Dr. Bartlett Mel - Biomedical Engineering Department mel@quake.usc.edu, http://quake.usc.edu/lnc.html Dr. 
Michael Arbib - Professor of Computer Science and Neurobiology Director of the USC Brain Project arbib@pollux.usc.edu, http:/www-hbp.usc.edu/HBP ---------------------------------------------------------------------------- ---------------------------------------------------------------------------- 1997 JSNC Submission Form - Return to: Linda Yokote yokote@bmsrs.usc.edu (e-mail submissions preferred) US mail: Joint Symposium Biomedical Engineering Department USC, MC 1451 Los Angeles, CA 90089 (213)740-0840, (213)740-0343 fax ---------------------------------------------------------------------------- _____ I would like to attend the Symposium. Registration fee of $25 includes lunch and Proceedings. Checks are payable to the Department of Biomedical Engineering, USC. _____ I would like to give a presentation Title: _______________________________________________________________________ _______________________________________________________________________ Abstract: (300 word abstract goes here) _______________________________________________________________________ Speaker's Name: ______________________________________________________________ Affiliation/Department: ______________________________________________________ Address: _____________________________________________________________________ _____________________________________________________________________ _____________________________________________________________________ Telephone: ________________________ E-mail Address: _____________________ Others who should be listed as co-authors: ______________________________ ______________________________________________________________________________ ______________________________________________________________________________ ______________________________________________________________________________ --- DEADLINE FOR RECEIPT OF ABSTRACTS: March 28, 1997 --- From rcs@cogsci.ed.ac.uk Fri Feb 28 03:33:59 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id DAA26976 for ; Fri, 28 Feb 1997 03:33:42 -0600 Received: from cs.uwa.oz.au (bilby.cs.uwa.oz.au [130.95.1.11]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id DAA21490 for ; Fri, 28 Feb 1997 03:33:37 -0600 (CST) Received: from (parma.cs.uwa.oz.au [130.95.1.7]) by cs.uwa.oz.au (8.6.8/8.5) with SMTP id OAA25073; Fri, 28 Feb 1997 14:56:19 +0800 Message-Id: <199702280656.OAA25073@cs.uwa.oz.au> From: rcs@cogsci.ed.ac.uk To: addistr@sis.port.ac.uk, Afcet.Administration@ibp.fr, ai-stats@watstat.uwaterloo.ca, alife@cognet.ucla.edu, annrules@fit.qut.edu.au, cogpsy@cogsci.soton.ac.uk, colt@cs.uiuc.edu, connectionists@cs.cmu.edu, dietmar@cognition.iig.uni-freiburg.de, genetic-programming@cs.stanford.edu, hybrid-list@cs.ua.edu, ml@ics.uci.edu, mlnet@csd.abdn.ac.uk, nat@ura1507.univ-paris13.fr, neuron-request@CATTELL.PSYCH.UPENN.EDU, nl-kr@snyside.sunnyside.com, nonlin-l@list.nih.gov, reinforce@cs.uwa.edu.au, scivw-request@hitl.washington.edu, www@sigart.acm.org, webmaster@ai.iit.nrc.ca, ernesto@dei.uc.pt, rcs@cogsci.ed.ac.uk Cc: rcs@cogsci.ed.ac.uk Subject: GALA '97 Language Acquisition conference, Edinburgh [connectionists] Date: Thu, 27 Feb 1997 12:53:58 GMT International Conference on Language Acquisition: Knowledge Representation and Processing GALA '97 4th - 6th April 1997 Edinburgh City Chambers ---- Plenary speakers ---- =20 Melissa Bowerman Nina Hyams Peter Jusczyk Steven Pinker Bonnie D. 
Schwartz
Paul Smolensky

The programme, below, consists of three full days containing talks by the 6 invited speakers, 40 20-minute talks, and 80 posters. Any changes to the programme will be posted as soon as possible to the conference web-page at http://www.cogsci.ed.ac.uk/gala. The abstracts of the talks and posters are available on the web-pages.

Friday 4th April
8.30-9.30, Registration (continues throughout the day)
Council Chamber 9.30-10.00, Opening Remarks
10.00-11.00 Council Chamber Plenary Session
Melissa Bowerman, Max Planck Institute for Psycholinguistics:
Predicate semantics and lexicosyntactic development: a crosslinguistic perspective
11.00-11.30, Break
11.30-12.00:
Council Chamber: David Lightfoot, Maryland:
Catastrophes and "cue-based" learning
European Room Ocke-Schweren Bohn & Linda Polka, Aarhus and McGill:
The nature and significance of perceptual biases in infant vowel perception
12.00-12.30:
Council Chamber Susan Powers & Julien Musolino, Potsdam & Maryland:
Precursor relative clauses in the acquisition of English
European Room Suzanne Curtin, Heather Goad, & Joe Pater, McGill, Southern California and British Columbia
On the acquisition of Thai VOT contrasts by native speakers of English
12.30-2.00, Lunch
2.00-2.30
Council Chamber Gert Westermann, Edinburgh:
A constructivist neural network learns the past tense of English verbs
European Room Shanley Allen, Max Planck
A discourse-pragmatic explanation for the subject-object asymmetry in early null arguments
2.30-3.00
Council Chamber Mark Seidenberg, Joseph Allen, & Morten Christiansen, Southern California:
Language acquisition: learning and applying probabilistic constraints
European Room Lamya Abdul-Mareem, Thomas Roeper, & Jill deVilliers, UMass, UMass, & Smith College:
LF-Feature movement and negative islands in acquisition
3.00-3.30
Council Chamber Joseph Allen, Southern California:
Probabilistic constraints in acquisition
European Room Kenneth Drozd, Max Planck:
The syntax and semantics of "no" and negative constituents in child English
3.30-4.00, Break
4.00-4.30
Council Chamber Boping Yuan, Cambridge:
An asymmetry of null subjects and null objects in Chinese speakers' L2 English
European Room Matthew Saxton, Royal Holloway, London:
Corrective input in child language acquisition
4.30-5.00
Council Chamber Philippe Prevost, McGill
Subjects in second language acquisition: evidence for the truncation hypothesis
European Room Lourdes de Leon, Reed College:
Why verbs are learnt before nouns in Tzotzil (Mayan): the role of caregiver input and of verb-specific semantics
5.00-6.00, Plenary Session:
Council Chamber Bonnie D. Schwartz, University of Durham:
The second language instinct
7.00-9.00, Reception

Saturday 5th April
9.00-10.00, Plenary Session: Council Chamber
Nina Hyams, UCLA: Language development at the interface
Council Chamber 10.00-10.30, Rita Manzini & Anna Roussou, UCL/Florence & Bangor:
Null arguments in early child grammars: a minimalist approach
European Room Laura Bosch & Nuria Sebastián, Barcelona:
Factors in native-language discrimination at four months: phonological proximity and bilingual exposure
10.30-11.00, Break
11.00-11.30
Council Chamber Linda Escobar, Sergio Baauw, & Bill Philip, Massachusetts at Boston & Utrecht
The wide scope interpretation of postverbal quantifier subjects: QR in the early grammar of Spanish
European Room C. Frenck-Mestre, Provence:
Examining second language reading: an on-line look
11.30-12.00
Council Chamber Silvina A. Montrul, McGill:
Transitivity alternations in Turkish as a second language
European Room Tamar Kaplan, Iowa:
General learning strategies and the process of L2 acquisition
12.00-2.00, Poster display (Mandela Room and Councillors' Lounge) and Lunch
2.00-3.00, Plenary Session: Council Chamber
Steven Pinker, MIT: Words and rules
Council Chamber 3.00-3.30, Helen Goodluck, Ottawa:
Relative clauses in the speech of Adam
European Room S. Wauquier-Gravelines, C. Jakubowicz, P. Sauzet, C. Durand, & S. Franc, CNRS Paris 5, CNRS Paris 5, CNRS Paris 8, CNRS Paris 5, & Hospital R. Debray:
Phonological knowledge in developmental language disorders: "Liaison enchaînée" and derivations in French
3.30-4.00
Council Chamber Cornelia Hamann & Kim Plunkett, Geneva & Oxford:
Subject omission in child Danish
European Room Sarah Barrett, Cambridge:
The prototype conversion model and its role in the developing system
4.00-4.30, Break
4.30-5.00
Council Chamber Arild Hestvik & Bill Philip, Bergen & Utrecht:
Chain condition errors and lexical feature acquisition in Norwegian children
European Room Seyhun Topbas & Handan Kopkalli-Yavuz, Anadolu:
The onset of a linguistic system: is there evidence from the acquisition of final devoicing in Turkish?
5.00-6.00, Plenary Session: Council Chamber
Paul Smolensky, Johns Hopkins: Universal grammar and learnability in Optimality Theory
8.30-1.00am, Ceilidh (Dancing!)

Sunday 6th April
9.00-10.00, Plenary Session: Council Chamber
Peter Jusczyk, Johns Hopkins: Constraining the search for structure in the input
10.00-10.30
Council Chamber Michelle Aldridge, Robert D. Borsley, Susan Clack, & Gwennan Creunant, Bob Morris Jones, Bangor & Aberystwyth:
The acquisition of NPs in Welsh
European Room Elizabeth Purnell, Indiana:
11.00-11.30
Council Chamber David LeBlanc, Tilburg:
Gradualness effects in the acquisition of null subjects: arguments from a theory of economy of projection
European Room Stefanie Jannedy, Ohio State:
Development of "narrow focus" prosody competence
11.30-12.00
Council Chamber Jeannette Schaeffer, MIT:
On the acquisition of object placement in Dutch and Italian
European Room Conxita Lleó, Hamburg:
Filler syllables, proto-articles and early prosodic constraints in Spanish and German.
12.00-2.00, Poster display (Mandela Room and Councillors' Lounge) and Lunch
2.00-2.30
Council Chamber Nigel Duffield, Philippe Prévost, & Lydia White, McGill:
Adult L2 knowledge of clitic placement: evidence from sentence matching
European Room Simon Kirby & James Hurford, Edinburgh:
The evolution of incremental learning
2.30-3.00
Council Chamber Liliana Sanchez, Carnegie Mellon:
Why do bilingual Spanish and Spanish in contact varieties drop definite objects?
European Room Aude Billard & Gillian Hayes, Edinburgh:
"Allo Kazam, do you follow me?": Learning to speak through imitation for social robots
3.00-3.30
Council Chamber Marianne Starren, Max Planck:
From scope adverbials to syntactic structure: the structural organisation of temporality in learner discourse
European Room Robert Clark, Edinburgh:
Language acquisition and implications for language change: a computer model
3.30-4.00, Break
4.00-4.30
Council Chamber Alison Henry, John Wilson, and Cathy Finlay, Ulster at Jordanstown:
Language acquisition and optionality
European Room Madelyn Kissock & Mark Hale, Harvard/Maine & Concordia:
Nonphonological triggers for renewed access to "phonetic" perception
4.30-5.30, Plenary Session: Council Chamber
Title to be announced

****************************************************************************
PROGRAMME OF POSTERS (Last updated, 26/2/97)

There will be no poster displays on Friday 4th April. Any changes to the programme will be posted as soon as possible.

SATURDAY 5TH APRIL
(Broadly covering topics in syntax in first and second language, prosody, modelling, bilingualism, semantics in second language, phonology in first language.)
(1) Anna Gavarro & Jaume Sola
Word order alternations and feature assignment in bilingual Catalan acquisition
(2) Lamya Abdul-Kareem & Thomas Roeper
Ellipsis and economy of representation in acquisition
(3) Sergio Baauw, William Philip & M. Angeles Escobar
A delayed principle B-effect in Spanish speaking children: the role of lexical feature acquisition
(4) Stefano Bertolo, Kevin Brohier, Edward Gibson & Kenneth Wexler
Characterizing conditions for cue-based learning in parametric language systems
(5) John Bullinaria
Modelling the acquisition of reading skills
(6) Ronnie Cann
Expressions, categories and the functional divide in first language acquisition
(7) Marina Yueh-Ching Chen
The acquisition of Ba-construction in Mandarin Chinese (first language acquisition)
(8) Morten Christiansen & Joseph Allen
Coping with variation in speech segmentation
(9) Bruno De Cara & D. Zagar
Phonetic factors in reading acquisition: the role of sonority contrast in syllable processing
(10) Renato Donfrancesco
Interjections and common ground in the development of mother-child conversation in Italian
(11) Mark T. Ellison
Simplicity, psychological plausibility and connectionism in language acquisition
(12) Kenneth Drozd
Unexpected weak quantification in children's interpretations of quantified sentences
(13) Malcolm Finney
Lexical, pragmatic and effects in interpreting pronominals in L2
(14) Stanka Fitneva
Referential opacity as a structure dependent phenomenon
(15) Steven Gillis, Walter Daelemans, Gert Durieux & Masja Kempen
Learning grammatical classes from phonological cues: an experiment
(16) Georgia Green
Modelling Grammar Growth; Universal grammar without innate principles or parameters
(17) Alison Henry
Dialect variation and language acquisition
(18) Adam Huffman, John Locke & Sandra Whiteside
Infant-caregiver interaction in Language Acquisition
(19) Tom Scutt & Oliver Rickard
Hasta la Vista, Baby: 'Bilingual' and 'Second-Language' learning in a recurrent neural network trained on English and Spanish sentences
(20) Richard Shillcock
The development of a speech segmentation strategy for English: an example of an emergent critical period effect
(21) Donna Lardiere
Morphology as evidence for extended phrase structure in nonnative grammar
(22) Yonata Levy & Anne Vainikka
Null Subject in a 'mixed' language - evidence from the acquisition of Hebrew
(23) Marta Llebaria
The sonority cycle in the acquisition of L2 phonologies
(24) Marta Lujan
Doubling Structures and the acquisition of agreement
(25) Ivana Lyon
Spontaneous production of past participles in Italian and underspecification of number
(26) Aida Martinovic-Zic
Satellites as Local Trajectories: English and Serbo-Croatian
(27) Mayumi Masuko
Eventuality and Temporality: Japanese learners' difficulty with English verb forms
(28) Gabriela Matos, Matilde Miguel, M. Joao Freitas & Isabel Hub Faria
Functional categories in early acquisition of European Portuguese
(29) Ayumi Matsuo
What children know about each other
(30) Kathleen McClure
Evidence against mutual exclusivity
(31) Frank Wijnen, Masja Kempen & Steven Gillis
Dutch children and their Mothers' infinitives
(32) Ramin Nakisa, Ulrike Hahn & Kim Plunkett
The dual-route model of the English past tense: Another case where defaults don't help
(33) S. Parfitt
Innate constraints are not necessary to overcome the projection problem, or Language acquisition as the learning of neural state trajectories
(34) Julian Pine, Elena V.M. Lieven & Caroline F. Rowland
Comparing different models of grammatical development
(35) Teresa Satterfield
The 'Shell Game': Why children never lose
(36) Ester Scarpa
Learning external sandhi: evidence for a top-down hypothesis of prosodic acquisition
(37) Gabriele Scheler
Transition from babbling to the one-word phase: A computational model
(38) Barbara Simpson
An examination of data elicitation methods in research into L2 pragmatics, and their influence on the determination of pragmatic competence in non-native speakers
(39) Ineke Van de Craats & Roeland Van Hout
Turkish and Moroccan adults learning Dutch possession. A conservation approach of L2-acquisition
(40) Joost van de Weijer
Language input to a prelinguistic child
(41) Angeliek van Hout
Learning telicity: argument structure and the syntax/semantics of direct objects
(42) Laura Wagner
Syntactic and semantic development of viewpoint aspect

SUNDAY 6TH APRIL
(Broadly covering topics in bilingualism, syntax in first and second language, processing in second language, language impairment, phonology in first language.)
(1) Mary Flaherty
Reading and deafness in Japan
(2) Sergey Avrutin
On some similarities between children's and aphasics' linguistic errors
(3) Elpida Bairaktari, Stephen Crain & Spyridoula Varlokosta
The acquisition of relative clauses in modern Greek: evidence for movement
(4) Eva Bar-Shalom & William Snyder
Root infinitives in Child Russian: A comparison with Italian
(5) Brett Berquist
Individual differences in working memory span and proficiency: capacity or processing efficiency?
(6) Heather van der Lely
Modularity and Innateness: Insight from a grammatical specific language impairment
(7) Dongdong Chen
L2 acquisition of the Zero Causative Morpheme
(8) Pascale Cole
Grammatical processing of gender and phonological information in reading words at the end of Primary School
(9) Jenny Dalalakis
Atypical morphological representations in Greek developmentally language impaired systems
(10) Susanne Dopke
The development of negation as an example of the path to functional differentiation in simultaneous bilingualism
(11) James Scobbie, William J. Hardcastle, Fiona Gibbon & Paul Fletcher
Covert contrasts in child speech
(12) Kayoko Enomoto
Interlanguage phonology: Perceptual acquisition of Japanese durational contrasts by adult English-speaking learners of Japanese
(13) Paula Fikkert & Maria Joao Freitas
Acquisition of syllable structure constraints: evidence from Dutch and Portuguese
(14) Gaberell Drachman
Telegraphic speech and constraint conflict in the acquisition of syntax
(15) Sonja Eisenbeiss
Bootstrapping into the case system
(16) Patrizia Giuliano
Some considerations on the acquisition of negative particles in French L2 and English L2
(17) Charles Grinstead
Nominative case checking in child language
(18) Mark Hale & Charles Reiss
What an OT parser tells us about the initial state of the grammar
(19) Cornelia Hamann & William Philip
French clitics and the delay of Principle B
(20) Anita Heijkoop
Underspecification and contrast consonant harmony in early phonological acquisition
(21) Bart Hollebrandse
The Split C-projection: evidence from the acquisition of Sequence of Tense
(22) Sophie Hsia
The role of articulatory memory in recall among second language learners
(23) Aafke Hulk
The acquisition of pronouns by a Dutch/French bilingual child: what can we learn?
(24) Celie Jakubowicz, Lea Nash & C. Gerard
Acquisition of determiners and syntactic clitics by French-speaking children with developmental language disorders: Computation and vocabulary insertion
(25) Sigal Uziel-Karl
How General is Verb Argument Structure? The Development History of Early Verbs in Hebrew
(26) Mariko Kondo
Acquisition of Japanese mora-timing by English speakers
(27) Zvi Penner & Manuela Schonenberger
Normal and impaired acquisition of subordination patterns in German
(28) Jocelynne Watson & Nigel Hewlett
Perceptual deficit in developmental phonological disorder
(29) Christina Schelletter & Imdra Sinka
Emerging functional categories in bilingual children
(30) Usha Lakshmanan & Bryan Lindsey
The Genitive of Negation and the position of NegP in the L2 acquisition of Russian
(31) Merce Prat-Sala & Richard Shillcock
The role of animacy in production: a developmental approach
(32) Dorit Ravid
Between syntax and the lexicon: the parallel between N-N compounds and N-A strings in acquisition
(33) Andreas Rohde
The L2-tense hypothesis: lexical aspect and temporal relations in the naturalistic L2 acquisition of verbal inflections (L1German/L2English)
(34) Tuula Savinainen-Makkonen
How do the Finnish children produce three-syllable words of adult language
(35) Petra Schulz
How children understand complement clauses of factive and non-factive verbs
(36) Sigriour Sigurjonsdottir
Root infinitives and null subjects in early Icelandic
(37) Roumyana Slabakova
Some Aspect-Related Constructions in English--a SM investigation
(38) Heike Tappe
Language acquisition and Agenesis of the Corpus callosum (ACC)
(39) Claudio Zmarich, Serena Bonifacio & Franco Ferrero
A phonetic and acoustic study of babbling in preterm Italian children
(40) Sharon Utakis
Pronouns and definiteness in child language

******************************************************************************
GENERAL INFORMATION AND REGISTRATION

The conference will take place at Edinburgh City Chambers, located on the historic Royal Mile in the centre of Edinburgh. Information about accommodation in Edinburgh can be found at the GALA 97 www site: http://www.cogsci.ed.ac.uk/gala/

The Spring Meeting of the Linguistic Association of Great Britain will be held in Edinburgh immediately after GALA '97 (April 7th - 9th), in Pollock Halls, University of Edinburgh.

Organisers: Antonella Sorace (Applied Linguistics), Caroline Heycock (Linguistics), Richard Shillcock (Cognitive Science).
Address for email correspondence: gala97@ling.ed.ac.uk
For further information, see: http://www.cogsci.ed.ac.uk/gala/

Appended below: (1) Registration form (for printing out and return by non-electronic means)

__________________________________________________________________________
REGISTRATION FORM
__________________________________________________________________________
GALA '97
April 4-6, 1997
University Of Edinburgh, Scotland

Please print out and complete 1 form per delegate and return, together with your remittance to:
GALA '97
Human Communication Research Centre
The University of Edinburgh
2 Buccleuch Place
Edinburgh EH8 9LW
Scotland

Please note that it may be possible to register on site, but places cannot be guaranteed, and that there is a discount for early registration (i.e. before 7th March)

SECTION 1: PERSONAL DETAILS
MS/MR/DR/PROF SURNAME FORENAME(S) (Please delete as appropriate)
INSTITUTION:
ADDRESS:
TEL NO:
FAX NO:
EMAIL:

SECTION 2: BOOKING DETAILS
                                          BEFORE     AFTER
                                          7/3/97     7/3/97
3 Days @ Full Delegate Rate                80.00     100.00   (pounds Sterling)
3 Days @ Student Rate                      40.00      50.00   (pounds Sterling)
(to qualify, proof of student status must accompany the registration).
Total Cost of Registration ____ (Registration fee includes a drinks reception on the first evening) Lunch can be provided at a rate of 10.00 (pounds Sterling) per day. Please tick the days for which you require lunch: FRI ___ SAT ___ SUN ___ Total cost of lunches ____ TOTAL PAYABLE ____ SECTION 3: SPECIAL REQUIREMENTS 1. DIETARY REQUIREMENTS Please specify if you require a vegetarian or vegan option. 2. OTHER REQUIREMENTS Please specify any other special requirements eg. wheelchair access,etc. 3. ACCOMMODATION There is a wide range of accommodation available in the city, and we have negotiated some discounted rates, and also set aside a limited number of rooms at the University Halls of Residence. You can find detailed information on the gala www pages, URL - http://www.cogsci.ed.ac.uk/gala/ If you do not have access to the WWW, you may email us to request this information at gala97@ling.ed.ac.uk. You may also contact the Tourist Board, Central Accommodation Services, Tel: +44 131 557 9655 SECTION 4: PAYMENT OF FEES 1. Cheques should be drawn on a British bank in pounds Sterling, and made payable to "The University of Edinburgh". 2. Direct transfers should be sent to: Edinburgh University Account - 00919680 Bank Of Scotland 32a Chambers Street Edinburgh Scotland Sort Code 80-02-24 Transfers must include mention of "Gala A/c No: 265000 G40182" 3. Credit cards: We are able to accept payment from a variety of credit cards (Visa, Mastercard, Switch, and Delta). Payments by credit card will incur an additional charge of 2% of the total amount due. If you wish to pay by credit card, please complete the following, and send it to us by post (Do not email your card number. we need an original signature). NAME: ADDRESS (to which card is registered): CARD TYPE: (VISA, MC, SWITCH, DELTA): CARD NUMBER: VALID FROM: EXPIRY DATE: AMOUNT DUE: (in pounds Sterling) ADMINISTRATION CHARGE (2%) (in pounds Sterling) _________________________________________________________________________= _ ------------- End Forwarded Message ------------- From ataxr@IMAP1.ASU.EDU Fri Feb 28 04:11:35 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id EAA27374 for ; Fri, 28 Feb 1997 04:11:30 -0600 Received: from cs.uwa.oz.au (bilby.cs.uwa.oz.au [130.95.1.11]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id EAA21850 for ; Fri, 28 Feb 1997 04:11:25 -0600 (CST) Received: from (parma.cs.uwa.oz.au [130.95.1.7]) by cs.uwa.oz.au (8.6.8/8.5) with SMTP id OAA25117; Fri, 28 Feb 1997 14:59:21 +0800 Message-Id: <199702280659.OAA25117@cs.uwa.oz.au> From: Asim Roy To: cneuro@bbb.caltech.edu, cogpsy@cogsci.soton.ac.uk, alife@cognet.ucla.edu, annrules@fit.qut.edu.au, colt@cs.uiuc.edu, genetic-programming@cs.stanford.edu, gann-list@cs.iastate.edu, hybrid-list@cs.ua.edu, ml@ics.uci.edu, mlnet@csd.abdn.ac.uk, neuron-request@CATTELL.PSYCH.UPENN.EDU, nonlin-l@list.nih.gov, reinforce@cs.uwa.edu.au, psyc@pucc.princeton.edu Cc: asim.roy@asu.edu Subject: Does plasticity imply local learning? And other questions Date: Thu, 27 Feb 1997 23:20:31 -0500 (EST) I thought it might be productive to the discussion if I compiled and posted the responses I have received so far. I am posting them without any comments. I hope this will promote further discussion on this topic. The original posting is attached below for reference. 
============================================================
Response # 1:

Re: your note on plasticity and learning

Asim,

As you mention, neuroscience tends to equate network plasticity with
learning. Connectionists tend to do the same. However, this raises a problem
with biological systems because it conflates the processes of development
and learning. Even the smartest organism starts from an egg, and develops
for its entire lifespan - how do we distinguish which changes are learnt and
which are due to development? No one would argue that we *learn* to have a
cortex, for instance, even though it is due to massive embryological changes
in the central nervous system of the animal.

This isn't a problem with artificial nets, because they do not usually have
a true developmental process and so there can be no confusion between the
two; but it has been a long-standing problem in the ethology literature,
where learnt changes are contrasted with "innate" developmental ones. A very
interesting recent contribution to this debate is Andre Ariew's "Innateness
and Canalization", in Philosophy of Science 63 (Proceedings), in which he
identifies non-learnt changes as being due to canalised processes.
Canalization was a concept developed by the biologist Waddington in the
1940s to describe how many changes seem to have fixed end-goals that are
robust against changes in the environment. The relationship between
development and learning was also thoroughly explored by Vygotsky (see
collected works vol 1, pages 194-210).

I'd like to see what other sorts of responses you get,

Joe Faith
Evolutionary and Adaptive Systems Group,
School of Cognitive and Computing Sciences,
University of Sussex, UK.
=================================================================
Response # 2:

I fully agree with you that local learning is not the one and only ultimate
approach - even though it results in very good learning for some domains. I
am currently writing a paper on the competitive learning paradigm. I am
proposing that this competition, which occurs e.g. within neurons, should be
called local competition. The network as a whole gives a global common goal
to these local competitors, and thus their competition must be regarded as
cooperation from a more global point of view. There is a nice paper by
Kenton Lynne that integrates the ideas of reinforcement and competition:
when external evaluations are present, they can serve as teaching values;
if not, the neurons compete locally.

@InProceedings{Lynne88,
  author    = {K.J.\ Lynne},
  title     = {Competitive Reinforcement Learning},
  booktitle = {Proceedings of the 5th International Conference on Machine Learning},
  year      = {1988},
  publisher = {Morgan Kaufmann},
  pages     = {188--199}
}

Best regards,
Christoph Herrmann
----------------------------------------------------------
Christoph Herrmann
Visiting researcher
Hokkaido University
Meme Media Laboratory
Kita 13 Nishi 8, Kita-          Tel:   +81 - 11 - 706 - 7253
Sapporo 060                     Fax:   +81 - 11 - 706 - 7808
Japan                           Email: chris@meme.hokudai.ac.jp
http://aida.intellektik.informatik.th-darmstadt.de/~chris/
----------------------------------------------------------
=============================================================
Response #3:

I've just read your list of questions on local vs. global learning
mechanisms. I think I'm sympathetic to the implications or presuppositions
of your questions but need to read them more carefully later. Meanwhile, you
might find very interesting a two-part article on such a mechanism by Peter G.
Burton in the 1990 volume of _Psychobiology_ 18(2), 119-161 & 162-194.

Steve Chandler
===============================================================
Response #4:

A few years back, I wrote a review article on issues of local versus global
learning w.r.t. synaptic plasticity. (Unfortunately, it has been "in press"
for nearly 4 years.) Below is an abstract. I can email the paper to you in
TeX or postscript format, or mail you a copy, if you're interested.

Russell Anderson
------------------------------------------------
"Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error"
(In press: Progress in Neural Networks)

Russell W. Anderson
Smith-Kettlewell Eye Research Institute
2232 Webster Street
San Francisco, CA 94115
Office: (415) 561-1715
FAX: (415) 561-1610
anderson@skivs.ski.org

Abstract: Neural network models offer a theoretical testbed for the study of
learning at the cellular level. The only experimentally verified learning
rule, Hebb's rule, is extremely limited in its ability to train networks to
perform complex tasks. An identified cellular mechanism responsible for
Hebbian-type long-term potentiation, the NMDA receptor, is highly versatile.
Its function and efficacy are modulated by a wide variety of compounds and
conditions and are likely to be directed by non-local phenomena.
Furthermore, it has been demonstrated that NMDA receptors are not essential
for some types of learning. We have shown that another neural network
learning rule, the chemotaxis algorithm, is theoretically much more powerful
than Hebb's rule and is consistent with experimental data. A biased
random-walk in synaptic weight space is a learning rule immanent in nervous
activity and may account for some types of learning -- notably the
acquisition of skilled movement.
------------------------------------------
E-mail: send request to rwa@milo.berkeley.edu
Slow mail: Russell Anderson
           2415 College Ave. #33
           Berkeley, CA 94704
==========================================================
Response #5:

Asim Roy typed ...

> B) "Pure" local learning does not explain a number of other
> activities that are part of the process of learning!!

...

> So, on the whole, there are a "number of activities" that need to be
> performed before any kind of "local learning" can take place. These
> aforementioned learning activities "cannot" be performed by a
> collection of "local learning" cells! There is more to the process
> of learning than simple local learning by individual cells. Many
> learning "decisions/tasks" must precede actual training by "local
> learners." A group of independent "local learners" simply cannot
> start learning and be able to reproduce the learning
> characteristics and processes of an "autonomous system" like the
> brain.

I cannot see how you can prove the above statement (particularly the last
sentence). Do you have any proof? By analogy, consider many insect colonies
(bees, ants, etc.). No one could claim that one of the insects has a global
view of what should happen in the colony. Each insect has its own purpose
and goes about that purpose without knowing the global purpose of the
colony. Yet an ants' nest does get built, and the colony does survive.
Similarly, it is difficult to claim that evolution has a master plan; order
just seems to develop out of chaos. I am not claiming that one type of
learning (local or global) is better than another, but I would like to see
some evidence for your somewhat outrageous claims.
> Note that the global learning mechanism may actually be implemented
> with a collection of local learners!!

You seem to contradict yourself here. You first say that local learning
cannot cope with many problems of learning, yet global learning can. You
then say that global learning can be implemented using local learners. This
is like saying that you can implement things in C that cannot be implemented
in assembly!! It may be more convenient to implement it in C (or using
global learning), but that doesn't make it impossible for assembly.

Cheers, Brendan.
-------------------------------------------------------------------
Brendan McCane, PhD.                Email: mccane@cs.otago.ac.nz
Comp.Sci. Dept., Otago University,  Phone: +64 3 479 8588.
Box 56, Dunedin, New Zealand.       There's only one catch - Catch 22.
===============================================================
Response #6:

Regarding arguments against global learning: I think no one seriously
questions this possibility, but many think that global learning theories are
currently non-verifiable/non-falsifiable. Part of the point of my paper was
that there ARE ways to investigate non-local learning, but it requires
changes in current experimental protocols.

Anyway, good luck. I look forward to seeing your compilation.

Russell
------------------------------------------
E-mail: send request to rwa@milo.berkeley.edu
Slow mail: Russell Anderson
           2415 College Ave. #33
           Berkeley, CA 94704
==============================================================
Response #7:

I am sorry that it has taken so long for me to reply to your inquiry about
plasticity and local/global learning. As I mentioned in my first note to
you, I am sympathetic to the view that learning involves some sort of
overarching, global mechanism even though the actual information storage may
consist of distributed patterns of local information. Because I am
sympathetic to such a view, it is very difficult for me to imagine and
anticipate the problems with such views. That's why I am glad to see that
you are explicitly trying to find people to point out possible problems; we
need the reality check.

The Peter Burton articles that I have sent you describe exactly the kind of
mechanism implied by your first question: Does plasticity imply local
learning? Burton describes a neurological mechanism by which local learning
could emerge from a global signal. Essentially he posits that whenever the
new perceptual input being attended to at any given moment differs
sufficiently from the record of previously recorded experiences to which
that new input is being compared, the difference triggers a global
"proceed-to-store" signal. This signal creates a neural "snapshot" (my term,
not Burton's) of the cortical activations at that moment, a global episodic
memory (subject to stimulus sampling effects, etc.). Burton goes on to
describe how discrete episodic memories could become associated with one
another so as to give rise to schematic representations of percepts
(personally I don't think that positing this abstraction step is necessary,
but Burton does it).

As neuroscientists sometimes note, while it is widely assumed that LTP/LTD
are local learning mechanisms, the direct evidence for such a hypothesis is
pretty slim at best. Of course, one of the most serious problems with that
view is that the changes don't last very long and thus are not really good
candidates for long-term (i.e., life-long) memory.
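[Editorial aside: the "proceed-to-store" idea described above can be made
concrete with a small toy sketch. The code below is emphatically not
Burton's model; it is a hypothetical illustration in which the class name
EpisodicStore and the novelty_threshold parameter are invented here. The
only point it makes is that the decision to store is global (it depends on a
comparison with everything stored so far), while what gets stored is a local
snapshot of current activity.]

import numpy as np

class EpisodicStore:
    """Toy sketch (not Burton's model): a global novelty signal gates
    whether a snapshot of the current activation pattern is stored."""

    def __init__(self, novelty_threshold=0.5):
        self.episodes = []                      # stored activation snapshots
        self.novelty_threshold = novelty_threshold

    def novelty(self, activations):
        # Global comparison: distance to the closest stored episode.
        if not self.episodes:
            return np.inf
        return min(np.linalg.norm(activations - e) for e in self.episodes)

    def present(self, activations):
        # If the input differs enough from everything stored so far, emit a
        # global "proceed-to-store" signal and snapshot the activations.
        proceed_to_store = self.novelty(activations) > self.novelty_threshold
        if proceed_to_store:
            self.episodes.append(activations.copy())
        return proceed_to_store

# Usage: present activation patterns; only sufficiently novel ones are stored.
store = EpisodicStore(novelty_threshold=0.5)
rng = np.random.default_rng(0)
for _ in range(5):
    store.present(rng.normal(size=8))
print(len(store.episodes))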
Now, to my mind, one of the most important possibilities overlooked in LTP
studies (inherently so in all in vitro preparations and, so far as I
know--which is not very far because this is not my field--in the in vivo
preparations that I have read about) is that LTP/D is either an artifact of
the experiment or some sort of short-term change which requires a global
signal to become consolidated into a long-term record. Burton describes one
such possible mechanism.

Another motivation for some sort of global mechanism comes from the
so-called 'binding problem' addressed especially by the Damasios, but others
too. Somehow, somewhere, all the distributed pieces of information about
what an orange is, for example, have to be tied together. A number of
studies of different sorts have demonstrated repeatedly that such
information is distributed throughout cortical areas.

Burton distinguishes between "perceptual learning" requiring no external
teacher (either locally or globally) and "conceptual learning", which may
require the assistance of a 'teacher'. In his model, though, both types of
learning are activated by global "proceed-to-learn" signals triggered in
turn by the global summation of local disparities between remembered
episodes and current input.

I'll just mention in closing that I am particularly interested in the
empirical adequacy of neuropsychological accounts such as Burton's because I
am very interested in "instance-based" or "exemplar-based" models of
learning. In particular, Royal Skousen's _Analogical Modeling of Language_
(Kluwer, 1989) describes an explicit, mathematical model for predicting new
behavior on analogy to instances stored in long-term memory. Burton's model
suggests a possible neurological basis for such behavior.
==============================================================
Response #8:
*******************************************************************
Fred Wolf                        E-Mail: fred@chaos.uni-frankfurt.de
Institut fuer Theor. Physik
Robert-Mayer-Str. 8              Tel: 069/798-23674
D-60 054 Frankfurt/Main 11       Fax: (49) 69/798-28354
Germany
*******************************************************************

Dear Asim Roy,

could you please point me to a few neuroBIOLOGICAL references that justify
your claim that

> A predominant belief in neuroscience is that synaptic plasticity
> and LTP/LTD imply local learning (in your sense).

I think many people appreciate that real learning implies the concerted
interplay of a lot of different brain systems, and that one should not even
attempt to explain it in terms of "isolated local learners". See e.g. the
series of review papers on memory in a recent volume of PNAS 93 (1996)
(http://www.pnas.org/).

Good luck with your general theory of global/local learning.

Best wishes,
Fred Wolf
==============================================================
Response #9:

Comp-Neuro Mailing List wrote:
>
> ===================================================================
> Date: Sun, 16 Feb 1997 22:57:11 -0500 (EST)
> From: Asim Roy
> Subject: Does plasticity imply local learning? And other questions
>
> Any comments on these ideas and possibilities are welcome.
>
> Asim Roy
> Arizona State University

I have been in neurocomputing for several years. I read your arguments with
interest. They certainly deserve further attention. Perhaps some combination
of global-local learning agents would be the right choice.

- Vassilis G.
Kaburlasos
Aristotle University of Thessaloniki, Greece
==============================================================
===============================================================
Original Memo:

A predominant belief in neuroscience is that synaptic plasticity and LTP/LTD
imply local learning. It is a possibility, but it is not the only
possibility. Here are some thoughts on some of the other possibilities (e.g.
global learning mechanisms or a combination of global/local mechanisms) and
some discussion of the problems associated with "pure" local learning. The
local learning idea is a core idea that drives research in a number of
different fields. I welcome comments on the questions and issues raised
here.

This note is being sent to many listservs. I will collect all of the
responses from different sources and redistribute them to all of the
participating listservs. The last such discussion was very productive. It
has led to the realization by some key researchers in the connectionist area
that "memoryless" learning perhaps is not a very "valid" idea. That
recognition by itself will lead to more robust and reliable learning
algorithms in the future. Perhaps a more active debate on the local learning
issue will help us resolve this issue too.

A) Does plasticity imply local learning?

The physical changes that are observed in synapses/cells in experimental
neuroscience when some kind of external stimulus is applied to the cells may
not result at all from any specific "learning" at the cells. The cells might
simply be responding to a "signal to change" - that is, to change by a
specific amount in a specific direction. In animal brains, it is possible
that the "actual" learning occurs in some other part(s) of the brain, say
perhaps by a global learning mechanism. This global mechanism can then send
"change signals" to the various cells it is using to learn a specific task.
So it is possible that in these neuroscience experiments, the external
stimuli generate signals for change similar to those of a global learning
agent in the brain, and that the changes are not due to "learning" at the
cells themselves.

Please note that scientific facts and phenomena like LTP/LTD or synaptic
plasticity can probably be explained equally well by many theories of
learning (e.g. local learning vs. global learning, etc.). However, the
correctness of an explanation would have to be judged from its consistency
with other behavioral and biological facts, not just "one single" biological
phenomenon or fact.

B) "Pure" local learning does not explain a number of other "activities"
that are part of the process of learning!!

When learning is to take place by means of "local learning" in a network of
cells, the network has to be designed prior to its training. Setting up the
net before "local" learning can proceed implies that an external mechanism
is involved in this part of the learning process. This "design" part of
learning precedes actual training or learning by a collection of "local
learners" whose only knowledge about anything is limited to the local
learning law to use! In addition, these "local learners" may have to be told
what type of local learning law to use, given that a variety of different
types can be used under different circumstances. Who, then, is to "instruct
and set up" such local learners, telling them which type of learning law to
use?
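[Editorial aside: to make the memo's notion of a "pure local learner"
concrete, here is a minimal sketch, not anything proposed in the memo, of a
Hebbian-style update in which each weight change depends only on the
activity of the two cells it connects. All names are invented for
illustration. The network size, initial wiring, learning rate and input
features all have to be supplied from outside, which is exactly the "design"
work the memo argues such learners cannot do for themselves.]

import numpy as np

def hebbian_step(W, pre, post, lr=0.01):
    """Purely local rule: each weight W[i, j] changes using only the
    pre-synaptic activity pre[j] and the post-synaptic activity post[i]."""
    return W + lr * np.outer(post, pre)

# Every choice below is made outside the local rule itself:
rng = np.random.default_rng(0)
n_inputs, n_outputs = 4, 2                        # network size: an external decision
W = 0.1 * rng.normal(size=(n_outputs, n_inputs))  # initial wiring: external
x = np.array([1.0, 0.0, 1.0, 0.0])                # which features to present: external

y = W @ x                                         # simple linear post-synaptic response
W = hebbian_step(W, x, y)                         # the only thing the local learner does

[The rule never sees the task as a whole, so questions like "what size net?"
or "which features?" cannot even be expressed inside it.]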
In addition to these tasks, the "passing" of appropriate information to the
appropriate set of cells also has to be "coordinated" by some external or
global learning mechanism. This coordination cannot just happen by itself,
like magic. It has to be directed from some place by some agent or
mechanism.

In order to learn properly and quickly, humans generally collect and store
relevant information in their brains and then "think" about it (e.g. what
problem features are relevant, the complexity of the problem, etc.). So
prior to any "local learning," there must be processes in the brain that
"examine" this "body of information/facts" about a problem in order to
design the appropriate network that would fit the problem complexity, select
the problem features that are meaningful, etc. It would be very difficult to
answer the questions "What size net?" and "What features to use?" without
looking at the problem (body of information) in great detail. A bunch of
"pure" local learners, armed with their local learning laws, would have no
clue about these issues of net design, generalization and feature selection.

So, on the whole, there are a "number of activities" that need to be
performed before any kind of "local learning" can take place. These
aforementioned learning activities "cannot" be performed by a collection of
"local learning" cells! There is more to the process of learning than simple
local learning by individual cells. Many learning "decisions/tasks" must
precede actual training by "local learners." A group of independent "local
learners" simply cannot start learning and be able to reproduce the learning
characteristics and processes of an "autonomous system" like the brain.

Local learning or local computation, however, is still a feasible idea, but
only within a general global learning context. A global learning mechanism
would be the one that "guides" and "exploits" these local learners or
computational elements. However, it is also possible that the global
mechanism actually does all of the computations (learning) and "simply sends
signals" to the network of cells for appropriate synaptic adjustment. Both
of these possibilities seem logical: (a) a "pure" global mechanism that
learns by itself and then sends signals to the cells to adjust, or (b) a
global/local combination where the global mechanism performs certain tasks
and then uses the local mechanism for training/learning. Thus note that the
global learning mechanism may actually be implemented with a collection of
local learners or computational elements!! However, certain "learning
decisions" are made in the global sense and not by "pure" local learners.

The basic argument being made here is that there are many tasks in a
"learning process" and that a set of "local learners" armed with their local
learning laws is incapable of performing all of those tasks. So local
learning can only exist in the context of global learning and thus is only
"a part" of the total learning process. It will be much easier to develop a
consistent learning theory using the global/local idea. The global/local
idea perhaps will also give us a better handle on the processes that we call
"developmental" and "evolutionary." And it will, perhaps, allow us to better
explain many of the puzzles and inconsistencies in our current body of
discoveries about the brain. And, not least, it will help us construct far
better algorithms by removing the "unwarranted restrictions" imposed on us
by the current ideas.

Any comments on these ideas and possibilities are welcome.
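[Editorial aside: options (a) and (b) above can be illustrated with a
schematic sketch. This is only one hypothetical rendering of the memo's
distinction, not a proposal from the memo or from any respondent; the names
GlobalLearner and LocalUnit, the feature-selection rule and the size
heuristic are all invented here for illustration.]

import numpy as np

class LocalUnit:
    """A local learner: it only ever sees its own inputs and its own output."""
    def __init__(self, n_in, rng):
        self.w = 0.1 * rng.normal(size=n_in)

    def local_update(self, x, lr=0.01):
        # Hebbian-style local rule: uses only this unit's input and output.
        y = self.w @ x
        self.w += lr * y * x

    def apply_change_signal(self, delta):
        # Option (a): the unit just applies a change dictated from outside.
        self.w += delta

class GlobalLearner:
    """Hypothetical global mechanism: it examines the whole body of data,
    decides the net size and the feature subset, then drives the local units."""
    def __init__(self, data, rng):
        # "Design" decisions made by looking at the data as a whole.
        self.features = [i for i in range(data.shape[1])
                         if data[:, i].std() > 0.1]     # crude feature selection
        n_units = max(1, len(self.features) // 2)        # crude size selection
        self.units = [LocalUnit(len(self.features), rng) for _ in range(n_units)]

    def train(self, data):
        # Option (b): the global mechanism coordinates which data reach which
        # units; the local rule does the actual weight adjustment.
        for x in data[:, self.features]:
            for unit in self.units:
                unit.local_update(x)

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 6))
learner = GlobalLearner(data, rng)
learner.train(data)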
Asim Roy Arizona State University From movellan@ergo.ucsd.edu Fri Feb 28 07:26:44 1997 Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id HAA29018 for ; Fri, 28 Feb 1997 07:26:39 -0600 Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id HAA24193 for ; Fri, 28 Feb 1997 07:26:38 -0600 (CST) Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa11994; 28 Feb 97 2:28:14 EST Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa11992; 28 Feb 97 2:19:51 EST Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa12332; 28 Feb 97 2:19:14 EST Received: from EDRC.CMU.EDU by B.GP.CS.CMU.EDU id ad09895; 27 Feb 97 20:02:19 EST Received: from ergo.ucsd.edu by EDRC.CMU.EDU id aa19481; 27 Feb 97 19:35:12 EST Received: (from movellan@localhost) by ergo.ucsd.edu (8.7.6/8.7.3) id QAA29932; Thu, 27 Feb 1997 16:31:44 -0800 Date: Thu, 27 Feb 1997 16:31:44 -0800 Message-Id: <199702280031.QAA29932@ergo.ucsd.edu> From: "Javier R. Movellan" To: connectionists@cs.cmu.edu Subject: TR Announcement The following technical report is available online at http://cogsci.ucsd.edu (follow links to Tech Reports & Software ) Physical copies are also available (see the site for information). From Contour Completion to Image Schemas: A Modern Perspective on Gestalt Psychology. Adrian Robert (Communicated by Martin I. Sereno) Department of Cognitive Science University of California San Diego The Gestalt approach to psychology represents an early but comprehensive and systematic attempt to relate psychological and neural functioning. When the approach was first formulated and actively researched, however, too little was known about brain function to forge a precise and direct connection. As a result, the approach never fulfilled its initial promise of a rigorously founded psychology grounded in physical science and has fallen out of the favor and attention of most contemporary students of the mind. In this paper we re-examine Gestalt psychology with reference to what is currently known of dynamic mechanisms of brain function, particularly by exploring plausible neural substrates of perceptual grouping. We suggest, based on this examination, that although many of the details of the Gestalt proposals are in need of revision, the approach remains fundamentally viable, and the elegant character of its grounding and systematicity make it a valuable framework for organizing present knowledge at both neural and functional levels. 
From gerda@ai.univie.ac.at Fri Feb 28 13:02:24 1997
Received: from lucy.cs.wisc.edu (lucy.cs.wisc.edu [128.105.2.11]) by sea.cs.wisc.edu (8.6.12/8.6.12) with ESMTP id NAA03922 for ; Fri, 28 Feb 1997 13:02:12 -0600
Received: from TELNET-1.SRV.CS.CMU.EDU (TELNET-1.SRV.CS.CMU.EDU [128.2.254.108]) by lucy.cs.wisc.edu (8.7.6/8.7.3) with SMTP id NAA01474 for ; Fri, 28 Feb 1997 13:02:10 -0600 (CST)
Received: from TELNET-1.SRV.CS.CMU.EDU by telnet-1.srv.cs.CMU.EDU id aa12707; 28 Feb 97 10:30:34 EST
Received: from DST.BOLTZ.CS.CMU.EDU by TELNET-1.SRV.CS.CMU.EDU id aa12705; 28 Feb 97 10:18:25 EST
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa12905; 28 Feb 97 10:18:09 EST
Received: from RI.CMU.EDU by B.GP.CS.CMU.EDU id aa17599; 28 Feb 97 4:03:07 EST
Received: from [192.76.244.69] by RI.CMU.EDU id aa29170; 28 Feb 97 4:01:23 EST
Received: from lux (lux.ai.univie.ac.at [192.76.244.82]) by dublin.ai.univie.ac.at (8.8.4/8.8.4) with SMTP id KAA28673; Fri, 28 Feb 1997 10:00:58 +0100 (MET)
Sender: gerda@ai.univie.ac.at
Message-ID: <33169ECA.2781E494@ai.univie.ac.at>
Date: Fri, 28 Feb 1997 10:00:58 +0100
From: Gerda Helscher
Organization: Dept.Medical Cybernetics and Artificial Intelligence, Univ.Vienna, Austria
X-Mailer: Mozilla 3.01Gold (X11; I; SunOS 4.1.4 sun4m)
MIME-Version: 1.0
To: connectionists@cs.cmu.edu
Subject: New Book: NN and a new AI
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit

!!! New Book Announcement !!!

=================================================
Neural Networks and a New Artificial Intelligence
=================================================

edited by Georg Dorffner
International Thomson Computer Press, London, 1997
ISBN 1-85032-172-8

About the book:
===============
Since the re-birth of interest in artificial neural networks in the
mid-1980s, they have become a much-discussed topic, particularly in terms of
their real contribution to the explanation and modelling of cognition, as
part of the field of artificial intelligence (AI). This edited collection
brings together a selection of papers from experts in their field, outlining
the concrete contribution that neural computing has made to AI. "Neural
Networks and a New Artificial Intelligence" is a collection of arguments,
examples and critical elaborations from different views on how and whether
neural networks can not only contribute to a better artificial intelligence,
but can also revolutionise it by forming the basis for a truly alternative
paradigm.

Contents:
=========
Introduction

Part I: General topics - new AI as a whole

New AI: naturalness revealed in the study of artificial intelligence
(by Erich Prem, Vienna)

Representational eclecticism - a foundation stone for the new AI?
(by Chris Thornton, Brighton)

Part II: Concrete approaches and research strategies

Towards a connectionist model of action sequences, active vision and breakdowns
(by Hugues Bersini, Brussels)

Complete autonomous systems: a research strategy for cognitive science
(by Rolf Pfeifer and Paul Verschure, Zurich)

Radical connectionism - a neural bottom-up approach to AI
(by Georg Dorffner, Vienna)

Connectionist explanation: taking position in the mind-brain dilemma
(by Paul Verschure, Zurich)

On growing intelligence
(by Jari Vaario, Kyoto, and Setsuo Ohsuga, Tokyo)

Part III: Issues in modelling high-level cognition

Systematicity and generalization in compositional connectionist representations
(by Lars Niklasson, Skövde, and Noel Sharkey, Sheffield)

Constructive learning in connectionist semantic networks
(by Joachim Diederich and James Hogan, Brisbane)

A connectionist model for the interpretation of metaphors
(by Stefan Wermter, Hamburg, and Ruth Hannuschka, Dortmund)

Some issues in neural cognitive modelling
(by Max Garzon, Memphis)

Neural networks and a new AI - questions and answers
(with contributions from all the authors)

Index