From pazzani@ics.uci.edu Sun Apr 30 14:44:10 1995
To: ML-LIST:;
Subject: Machine Learning List: Vol. 7, No. 8
Reply-To: ml@ics.uci.edu
Date: Sun, 30 Apr 1995 09:58:35 -0700
From: Michael Pazzani
Message-Id: <9504301024.aa28817@q2.ics.uci.edu>

Machine Learning List: Vol. 7, No. 8
Saturday, April 29, 1995

Contents:
   Classes from Data! Autoclass C is here
   Announce - Synthetic Classification Data Sets program
   KLML workshop at ECML-95
   AAAI Fall Symposium on Rational Agency
   CFP: special issue of Evolutionary Computation
   NNCM-95
   Neural Networks short course

The Machine Learning List is moderated. Contributions should be relevant
to the scientific study of machine learning. Mail contributions to
ml@ics.uci.edu. Mail requests to be added or deleted to
ml-request@ics.uci.edu. Back issues may be FTP'd from ics.uci.edu in
pub/ml-list/V<X>/N or N.Z, where X and N are the volume and number of
the issue; ID: anonymous PASSWORD: <your mail address>
URL: http://www.ics.uci.edu/AI/ML/Machine-Learning.html

----------------------------------------------------------------------

Subject: Classes from Data! Autoclass C is here
Date: Wed, 19 Apr 95 10:06:37 PDT
From: Will Taylor

Announcing AUTOCLASS C, the mostest Bayesian Classifier ever!

You describe a bunch of cases with as many attributes as you like, and
before you can factor a 500-digit number, out pops a set of classes
defined by a small set of class parameters.

FEATURES:
 * Finds the optimal number of classes automatically.
 * Can deal with real-valued, discrete, or missing values.
 * Can leap tall buildings in a single bound. (scratch that, wrong ad)
 * Class/case assignments are probabilistic, not just either/or
   (a toy illustration appears at the end of this message).
 * Indicates which attributes are most influential for which classes.
 * Described in respectable publications (references provided).
 * You can stop the search and get the current best answer at any time.
 * Estimates the rate of progress of the search, to help you decide
   when to quit.
 * Fully Bayesian - uses priors and finds the maximum-posterior
   classification.

How much would you pay for all this? Don't answer! There's more. If you
act now, you also get:
 * ANSI C, with source code as standard equipment.
 * Organically grown and bug-free (we call them features).
 * Our cheerful support staff will answer all your calls 3:00-3:01am.
 * Invented by NASA, the guys who gave you the moon.
 * Programmed by academics: Diane Cook and Joe Potts at UTexas-Arlington.
 * Price: FREE! Or make an offer!
 * Available via anonymous ftp.

Questions? Comments? Want the Movie Rights? Contact us at:
http://ic-www.arc.nasa.gov/ic/projects/bayes-group/group/html/autoclass-c-program.html
or send e-mail to taylor@ptolemy.arc.nasa.gov

The Bayes Boys,
Peter Cheeseman, John Stutz, Robin Hanson, Will Taylor
(No No, REALLY! We're Serious!)
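To give a concrete feel for "probabilistic, not just either/or" class
assignments, here is a minimal Python sketch of plain maximum-likelihood
EM for a one-dimensional Gaussian mixture. This is a stand-in for the
general idea only, not AutoClass itself: AutoClass performs a fully
Bayesian search (priors over parameters, posterior comparison of
different numbers of classes) and handles discrete and missing values,
none of which this toy attempts. All names and parameters below are
invented for illustration.

  import math
  import random

  def em_mixture(xs, k, iters=50):
      # Random initial means drawn from the data; unit variance;
      # uniform class weights.
      mus = random.sample(xs, k)
      weights = [1.0 / k] * k
      resp = []
      for _ in range(iters):
          # E-step: posterior probability of each class for each case.
          resp = []
          for x in xs:
              ps = [w * math.exp(-0.5 * (x - m) ** 2)
                    for w, m in zip(weights, mus)]
              z = sum(ps)
              resp.append([p / z for p in ps])
          # M-step: re-estimate weights and means from the soft counts.
          for j in range(k):
              nj = sum(r[j] for r in resp)
              weights[j] = nj / len(xs)
              mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
      return weights, mus, resp

  data = [random.gauss(0, 1) for _ in range(100)] + \
         [random.gauss(5, 1) for _ in range(100)]
  w, m, resp = em_mixture(data, 2)
  print(m)        # recovered means, roughly [0, 5] (order arbitrary)
  print(resp[0])  # soft class membership of the first case

Each case ends up with a membership probability for every class, rather
than a hard assignment - the property the feature list above advertises.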
------------------------------

Subject: Announce - Synthetic Classification Data Sets program
Date: Mon, 10 Apr 95 14:19:20 PDT
From: Gabor_Melli@cs.sfu.ca

One important way to test learning-from-example algorithms is to
evaluate their performance against well-understood synthetic data sets.
The Synthetic Classification Data Sets (SCDS) program has been created
to generate synthetic data sets that are particularly useful for
testing Knowledge Discovery from Databases (KDD) algorithms.

SCDS first generates a synthetic set of rules and then generates a
relation that abides by this rule base. Several characteristics of the
process can be customized. Rule bases may be in Conjunctive Normal
Form, with easy customization of the quantity and complexity of the
rules. Data sets can also be customized, for example, to include
interesting real-world characteristics such as irrelevant attributes,
missing attributes, noisy data, and missing values. (A toy sketch of
this generate-rules-then-sample scheme appears at the end of this
announcement.)

The project files and information are located at:
http://fas.sfu.ca/cs/people/GradStudents/melli/SCDS

While the ANSI C source code for version 1.0 is available, you are
encouraged to test the user-friendly interactive WWW Form interface!

Gabor Melli
School of Computing Science
Simon Fraser University
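The generate-rules-then-sample scheme can be illustrated in a few
lines. The Python sketch below is hypothetical and far simpler than
SCDS (single-conjunction rules over boolean attributes rather than
CNF); all names and parameter values are made up. It shows the shape of
the process: draw a random rule base, then sample cases labeled by the
rules, with optional label noise.

  import random

  N_ATTRS = 6          # boolean attributes a0..a5
  NOISE = 0.05         # label-noise rate

  def random_rule(n_literals=2):
      # A rule is a conjunction of attribute=value tests.
      attrs = random.sample(range(N_ATTRS), n_literals)
      return [(a, random.choice([0, 1])) for a in attrs]

  def satisfies(case, rule):
      return all(case[a] == v for a, v in rule)

  def generate(rules, n_cases):
      rows = []
      for _ in range(n_cases):
          case = [random.randint(0, 1) for _ in range(N_ATTRS)]
          # Label is the first class whose rule the case satisfies.
          label = next((c for c, r in enumerate(rules)
                        if satisfies(case, r)),
                       len(rules))  # default class if no rule fires
          if random.random() < NOISE:
              label = random.randrange(len(rules) + 1)  # noisy label
          rows.append((case, label))
      return rows

  rules = [random_rule() for _ in range(3)]
  for case, label in generate(rules, 5):
      print(case, "->", label)

Because the rule base is known exactly, a learner's output can be
compared against the true rules, not just against held-out accuracy.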
------------------------------

Subject: KLML workshop at ECML-95
Date: Mon, 10 Apr 1995 15:14:42 +0200
From: Dieter Fensel

MLnet Sponsored Familiarization Workshop
Knowledge Level Modelling and Machine Learning
Heraklion, Crete, Greece
April 28-29, 1995

Workshop Program

Friday 28 April, 1995
=====================

09:30 - 10:30  Invited talk: Knowledge-level Modelling and the New
               Knowledge Engineering Cycle. A. Th. Schreiber

14:00 - 16:00  Session I: Applying Machine Learning in a Modelling
               Framework. (session chair: Dieter Fensel)
   How to Do Things with Data: Exploiting Knowledge-Level
   Redescriptions. James Cupit and Nigel Shadbolt
   Learning Control Knowledge in Models of Expertise. Remco Straatman
   Interactive Refinement of Structured Knowledge-Based Systems.
   Eveline M. Helsper and Maarten W. van Someren
   Object-oriented Data Modelling and Rules: ILP Meets Databases.
   Lubos Popelinsky

16:30 - 18:30  Session II: Applying a Modelling Framework to Machine
               Learning. (session chair: Celine Rouveirol)
   Modeling Identification Strategies Using the MCC Methodology.
   Karine Chausse and Jacques Lebbe
   Automated Model Selection. D. Mladenic

Saturday 29 April, 1995
=======================

09:00 - 10:45  Session III: General Issues on Integrating Knowledge
               Level Modelling and Machine Learning.
               (session chair: Claire Nedellec)
   Beyond the Knowledge Level: Behavior Descriptions of Machine
   Learning Systems. Franz Schmalhofer and J. Stuart Aitken
   Towards a Knowledge Level Formalization of Learning. Denis Pierre
   and Joel Quinqueton
   Multi-Level Concept Modelling for Similarity-Based Learning.
   Engelbert Mephu Nguifo

11:15 - 13:00  Session IV: Summary and Future Activities.
               (session chair: Enric Plaza i Cervera)

The proceedings of the workshop are available via anonymous ftp:
ftp swi.psy.uva.nl, directory pub/papers/fensel/klml.

Further questions can be directed to Dieter Fensel
(dieter@swi.psy.uva.nl).

------------------------------

Subject: AAAI Fall Symposium on Rational Agency
Date: Tue, 11 Apr 1995 18:06:53 -0700
From: Michael Fehling

CALL FOR PARTICIPATION -- AAAI FALL SYMPOSIUM

Rational Agency: Concepts, Theories, Models, and Applications
November 10-12, 1995
Massachusetts Institute of Technology
Cambridge, Massachusetts

EXTENDED SUBMISSION DEADLINE: May 3, 1995

1. Description

This symposium explores conceptions of rational agency and their
implications for theory, research, and practice. The view that
intelligent systems are, or ought to be, rational agents underlies a
good deal of the theory and research in artificial intelligence and
related disciplines. However, consensus has yet to be reached on a
proper view of agency and on rationality principles for practical
agents. Traditionally, agents are presumed to be disposed toward
purposive, goal-directed action. However, alternative theories abound
in which agent behavior is portrayed as fundamentally reactive. Some
theories focus on agents' abilities to embody and manipulate private
systems of belief. Other theories depict agents' abilities to interact
overtly with their environment and sometimes with other agents.
Application builders have recently broadened the term "agent" to refer
to any embedded system performing tasks in support of human users.

Accounts of rationality are equally diverse. Rationality involves
having reasons that warrant particular beliefs (epistemic rationality)
or particular desires and actions (strategic or practical rationality).
Many models of agency embrace such logical standards as consistency,
consequential closure, or soundness for epistemic rationality. Other
agent models depict practical rationality as based on some classical or
non-monotonic logic for reasoning about action. Such logicist views of
rational agency are now being challenged by decision-theoretic models
emphasizing optimal choice of actions under uncertainty. This includes
recent work on decision-theoretic principles of limited rationality.

We invite researchers with an active interest in the concept of
rational agency. Relying upon a mixture of informal presentations and
group discussion, participants will critically examine agency concepts
and rationality principles, review computational agent models and
applications, and explore opportunities for future work on this topic.

2. Submission Information

Those interested in participating should submit a concept paper of no
more than five pages describing their work in relation to any of the
following questions:

 * Is rationality important; must an agent be rational to be
   successful?
 * What are suitable principles of epistemic, strategic, or limited
   rationality?
 * Are rationality principles applicable to retrospective processes
   such as learning?
 * What, if any, are the general requirements on rational agent
   architectures?
 * How, if at all, must a model of rational agency be modified to
   account for social, multi-agent interaction?

Those wishing to make a specific presentation should outline its
contents in their concept paper. NOTE: While we recognize that our
topic lends itself to formal analysis, we also encourage discussion of
experimental work with implemented agents.

EXTENDED SUBMISSION DEADLINE: The deadline for receipt of concept
papers has been extended to Wednesday, May 3, 1995. Concept papers must
be emailed to the committee chair, Michael Fehling, at
fehling@lis.stanford.edu. Please send your paper as either plain ASCII
text or as a PostScript file.

3. Program Committee

 Michael Fehling, Chair - Laboratory for Intelligent Systems,
   Stanford University
 Don Perlis - Dept. of Computer Science, University of Maryland
 Martha Pollack - Dept. of Computer Science, University of Pittsburgh
 John Pollock - Philosophy Dept., University of Arizona

------------------------------

Date: Thu, 13 Apr 1995 09:06:24 +0500
From: Peter Turney
Subject: CFP: special issue of Evolutionary Computation

Call for Papers

EVOLUTIONARY COMPUTATION
Special Issue on
EVOLUTION, LEARNING, AND INSTINCT: 100 YEARS OF THE BALDWIN EFFECT

In 1896, James Mark Baldwin proposed that individual learning can
explain evolutionary phenomena that appear to require Lamarckian
inheritance of acquired characteristics. The ability of individuals to
learn can guide the evolutionary process. In effect, learning smoothes
the fitness landscape, thus facilitating evolution.
This first aspect of the Baldwin effect has recently received much
attention, especially for its applications to computational problem
solving. In evolutionary algorithms, local search is analogous to
individual learning. Improvements found via local search change the
fitness of an individual without changing the actual genotype.

Baldwin further proposed that abilities that initially require learning
are eventually replaced by the evolution of genetically determined
systems that do not require learning. Thus learned behaviours may
become instinctive behaviours in subsequent generations, without
appealing to Lamarckian inheritance. This aspect of the Baldwin effect
deserves more attention. Recent work suggests that intuitive abilities
in human language, physics, biology, and arithmetic may be largely
instinctive. The Baldwin effect can help us understand the relationship
between learning and instinct. Furthermore, increased understanding of
this second aspect of the Baldwin effect may enable us to improve the
performance of computational problem solving by hybrids of genetic
algorithms and local search algorithms (a toy illustration of such a
hybrid appears after the topic list below).

A special issue of Evolutionary Computation is planned for 1996, the
100th anniversary of Baldwin's paper. Evolutionary Computation provides
an international forum for facilitating and enhancing the exchange of
information among researchers involved in both the theoretical and
practical aspects of computational systems of an evolutionary nature.
Papers are solicited that address both theoretical and computational
work related to evolution, learning, instinct, and the Baldwin effect.
Examples of topics of interest include:

 * When is learning advantageous? Learning can facilitate evolution by
   allowing individuals to adapt more quickly to fitness landscapes
   that would otherwise be difficult to exploit; however, the ability
   to learn weakens the selective forces acting on an individual, which
   can slow evolutionary change.

 * When is instinctive behaviour advantageous? Instincts can be fast
   and dependable in a static environment, although they may not be
   able to cope with radical environmental change. Thus the trade-off
   between instinctual and learned responses may have implications for
   search and learning in dynamic computational environments.

 * From a design perspective, what parts of an individual's cognitive
   machinery should be modifiable by experience (local search) and what
   parts should be determined by evolution (genetic search)? For
   example, in a typical hybrid of a genetic algorithm and a neural
   network, the genetic algorithm determines the network architecture
   and back-propagation determines the network weights. Is there a
   Baldwinian justification for this division of labour?

 * How does learned behaviour become instinctive? It seems plausible
   that, for some learned behaviours, there is no evolutionary path
   that leads to an instinctive replacement for the behaviour. For
   computational problem solving with hybrid genetic algorithms, what
   techniques can we use to encourage learned behaviours to evolve into
   instinctive behaviours?
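Here is a minimal, hypothetical Python sketch of the kind of Baldwinian
hybrid described above: selection is based on fitness measured after a
bout of individual learning (greedy bit-flipping local search), but the
learned improvements are never written back into the genotype, so there
is no Lamarckian inheritance. The all-ones target, population sizes,
and function names are invented for illustration.

  import random

  TARGET = [1] * 20                      # toy all-ones target

  def fitness(bits):
      return sum(b == t for b, t in zip(bits, TARGET))

  def learn(bits, steps=5):
      # Individual learning: greedy bit-flipping local search.
      best = bits[:]
      for _ in range(steps):
          i = random.randrange(len(best))
          trial = best[:]
          trial[i] ^= 1
          if fitness(trial) > fitness(best):
              best = trial
      return best

  def mutate(g, rate=0.05):
      return [b ^ 1 if random.random() < rate else b for b in g]

  def evolve(pop_size=30, gens=40):
      pop = [[random.randint(0, 1) for _ in range(20)]
             for _ in range(pop_size)]
      for _ in range(gens):
          # Selection sees post-learning fitness (the "smoothed"
          # landscape), yet offspring inherit only the raw genotype.
          scored = sorted(pop, key=lambda g: fitness(learn(g)),
                          reverse=True)
          parents = scored[: pop_size // 2]
          pop = [mutate(random.choice(parents))
                 for _ in range(pop_size)]
      return max(pop, key=fitness)

  print(fitness(evolve()))

Letting selection see post-learning fitness is what "smoothes" the
landscape: a genotype within learning distance of a good solution is
rewarded almost as if it already encoded that solution.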
Further Information

For more information, see http://ai.iit.nrc.ca/baldwin/cfp.html on the
World Wide Web or send a message to the contact address listed below.

Instructions for Submitting Papers

Papers should describe mature work that is original in nature and has
not been published elsewhere. Manuscripts should be approximately 8,000
to 12,000 words in length and formatted for 8 1/2 x 11-inch paper,
single-sided and double-spaced. The first page should include the
title, abstract, key words, and author information (name, affiliation,
mailing address, telephone number, and e-mail address). The text of the
paper should begin on the second page and continue on consecutively
numbered pages. For more information on the format, consult the inside
back cover of a recent issue of Evolutionary Computation. Send five
hard copies (not faxes) to the contact address listed below. Electronic
submissions in PostScript are also acceptable.

Important Dates

Manuscripts due:                    February 1, 1996
Acceptance notification:            May 1, 1996
Final manuscript due:               August 1, 1996
Planned publication date of issue:  December 1996

Guest Editors

Peter D. Turney, National Research Council, Canada
Darrell Whitley, Colorado State University, USA
Russell W. Anderson, University of California, USA

Contact Address

Dr. Peter Turney
Knowledge Systems Laboratory
Institute for Information Technology
National Research Council Canada
Ottawa, Ontario, Canada K1A 0R6
(613) 993-8564 (office)
(613) 952-7151 (fax)
peter@ai.iit.nrc.ca

------------------------------

Subject: NNCM-95
Date: Mon, 17 Apr 1995 20:33:50 BST
From: Jerry Connor

SECOND ANNOUNCEMENT AND CALL FOR PAPERS

NNCM-95
Third International Conference On
NEURAL NETWORKS IN THE CAPITAL MARKETS
Thursday-Friday, October 12-13, 1995
with tutorials on Wednesday, October 11, 1995
The Langham Hilton, London, England

(Note the deadline for camera-ready full papers, to be published in
hardback conference proceedings.)

Neural networks are now emerging as a major modeling methodology in
financial engineering. Because of the overwhelming interest in the NNCM
workshops held in London in 1993 and Pasadena in 1994, the third annual
NNCM conference will be held on October 12-13, 1995, in London. NNCM-95
will take a critical look at state-of-the-art neural network
applications in finance. This is a research meeting where original,
high-quality contributions to the field are presented and discussed. In
addition, a day of introductory tutorials (Wednesday, October 11) will
be included to familiarise audiences of different backgrounds with
financial engineering, neural networks, and the mathematical aspects of
the field.

Application areas include:
 + Bond and stock valuation and trading
 + Foreign exchange rate prediction and trading
 + Commodity price forecasting
 + Risk management
 + Tactical asset allocation
 + Portfolio management
 + Option pricing
 + Trading strategies

Technical areas include, but are not limited to:
 + Neural networks
 + Nonparametric statistics
 + Econometrics
 + Pattern recognition
 + Time series analysis
 + Model selection
 + Signal processing and control
 + Genetic and evolutionary algorithms
 + Fuzzy systems
 + Expert systems
 + Machine learning

Instructions for Authors

Authors who wish to present a paper should mail a copy of their
extended abstract (4 pages, single-sided, single-spaced) typed on A4
(8.5" by 11") paper to the secretariat no later than May 31, 1995.
Submissions will be refereed by no fewer than four referees, and
authors will be notified of acceptance by 14 June 1995. Authors of
accepted papers will be mailed guidelines for producing final
camera-ready papers, which are due 12 July 1995. The accepted papers
will appear in hardback conference proceedings published by World
Scientific. Separate registration is required using the attached
registration form. Authors are encouraged to submit abstracts as soon
as possible.
Registration

To register, complete the registration form and mail it to the
secretariat. Please note that attendance is limited and will be
allocated on a "first-come, first-served" basis.

Secretariat

For further information, please contact the NNCM-95 secretariat:
Ms Busola Oguntula, London Business School
Sussex Place, Regent's Park, London NW1 4SA, UK
e-mail: boguntula@lbs.lon.ac.uk
phone: (+44) (0171) 262 50 50
fax:   (+44) (0171) 724 78 75

Location

The main conference will be held at The Langham Hilton, which is
situated near Regent's Park and is a short walk from Baker Street
Underground Station. Further directions, including a map, will be sent
to all registrants.

Programme Committee

Dr A. Refenes, London Business School (Chairman)
Dr Y. Abu-Mostafa, Caltech
Dr A. Atiya, Cairo University
Dr N. Biggs, London School of Economics
Dr D. Bunn, London Business School
Dr M. Jabri, University of Sydney
Dr B. LeBaron, University of Wisconsin
Dr A. Lo, MIT Sloan School
Dr J. Moody, Oregon Graduate Institute
Dr C. Pedreira, Catholic University PUC-Rio
Dr M. Steiner, Universitaet Muenster
Dr A. Timmermann, University of California, San Diego
Dr A. Weigend, University of Colorado
Dr H. White, University of California, San Diego

Hotel Accommodation

Convenient hotels include:

The Langham Hilton
1 Portland Place, London W1N 4JA
Tel: (+44) (0171) 636 10 00
Fax: (+44) (0171) 323 23 40

Sherlock Holmes Hotel
108 Baker Street, London NW1 1LB
Tel: (+44) (0171) 486 61 61
Fax: (+44) (0171) 486 08 84

The White House Hotel
Albany St., Regent's Park, London NW1
Tel: (+44) (0171) 387 12 00
Fax: (+44) (0171) 388 00 91

--------------------------- Registration Form ---------------------------

NNCM-95 Registration Form
Third International Conference on
Neural Networks in the Capital Markets
October 12-13, 1995

Name:____________________________________________________
Affiliation:_____________________________________________
Mailing Address:_________________________________________
_________________________________________________________
Telephone:_______________________________________________

****Please circle the applicable fees and write the total below****

Main Conference (October 12-13):                   (British Pounds)
 Registration fee                                       450
 Discounted fee for academicians                        250
   (letter on university letterhead required)
 Discounted fee for full-time students                  100
   (letter from registrar or faculty advisor required)

Tutorials (October 11): You must be registered for the main conference
in order to register for the tutorials.
                                                   (British Pounds)
 Morning session only                                   100
 Afternoon session only                                 100
 Both sessions                                          150
 Full-time students                                      50
   (letter from registrar or faculty advisor required)

TOTAL: _________

Payment may be made by: (please tick)
 ____ Check payable to London Business School
 ____ VISA   ____ Access   ____ American Express
Card Number:___________________________________

------------------------------

Date: Thu, 27 Apr 95 14:33:09 PDT
From: lpease@admin.ogi.edu
Subject: Neural Networks short course

Oregon Graduate Institute of Science & Technology, Office of Continuing
Education, offers the short course:

NEURAL NETWORKS: ALGORITHMS AND APPLICATIONS
June 12-16, 1995, at the OGI campus near Portland, Oregon.

Course Organizer: John E. Moody
Lead Instructor:  Hong Pi
With Lectures By: Dan Hammerstrom, Todd K. Leen, John E. Moody,
                  Thorsteinn S. Rognvaldsson, Eric A. Wan
Artificial neural networks (ANN) have emerged as a new information
processing technique and an effective computational model for solving
pattern recognition and completion, feature extraction, optimization,
and function approximation problems. This course introduces
participants to neural network paradigms and their applications in
pattern classification; system identification; signal processing and
image analysis; control engineering; diagnosis; time series prediction;
financial analysis and trading; and speech recognition.

Designing a neural network application involves steps from data
preprocessing to network tuning and selection. This course, with many
examples, application demos, and hands-on lab practice, will
familiarize the participants with the techniques necessary for building
successful applications. About 50 percent of the class time is assigned
to lab sessions. The simulations will be based on Matlab, the Matlab
Neural Net Toolbox, and other software running on 486 PCs.

Prerequisites: Linear algebra and calculus. Previous experience with
Matlab is helpful, but not required.

Who will benefit: Technical professionals, business analysts, and other
individuals who wish to gain a basic understanding of the theory and
algorithms of neural computation and/or are interested in applying ANN
techniques to real-world, data-driven modeling problems.

Course Objectives: After completing the course, students will:
 - Understand the basic neural network paradigms
 - Be familiar with the range of ANN applications
 - Have a good understanding of the techniques for designing
   successful applications
 - Gain hands-on experience with ANN modeling

Course Outline

Neural Networks: Biological and Artificial
  The biological inspiration. History of neural computing. Types of
  architectures and learning algorithms. Application areas.

Simple Perceptrons and Adalines
  Decision surfaces. Perceptron and Adaline learning rules. Stochastic
  gradient descent. Lab experiments. (A minimal example of these
  learning rules appears after this outline.)

Multi-Layer Feed-Forward Networks I
  Multi-layer perceptrons. Back-propagation learning. Generalization.
  Early stopping. Network performance analysis. Lab experiments.

Multi-Layer Feed-Forward Networks II
  Radial basis function networks. Projection pursuit regression.
  Variants of back-propagation. Levenberg-Marquardt optimization. Lab
  experiments.

Network Performance Optimization
  Network pruning techniques. Input variable selection. Sensitivity
  analysis. Regularization. Lab experiments.

Neural Networks for Pattern Recognition and Classification
  Nonparametric classification. Logistic regression. Bayesian approach.
  Statistical inference. Relation to other classification methods.

Self-Organized Networks and Unsupervised Learning
  K-means clustering. Kohonen feature mapping. Learning vector
  quantization. Adaptive principal components analysis. Exploratory
  projection pursuit. Applications. Lab experiments.

Time Series Prediction with Neural Networks
  Linear time series models. Nonlinear approaches. Case studies:
  economic and financial time series analysis. Lab experiments.

Neural Networks for Adaptive Control
  Nonlinear modeling in control. Neural network representations for
  dynamical systems. Reinforcement learning. Applications. Lab
  experiments.

Massively Parallel Implementation of Neural Nets on the Desktop
  Architecture and application demos of the Adaptive Solutions CNAPS
  system.

Current State of Research and Future Directions
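As a taste of the "Simple Perceptrons and Adalines" session above, here
is the classic perceptron learning rule in a few lines of Python. This
is the standard textbook algorithm, not OGI course material; the toy
data set and parameter values are made up for illustration.

  import random

  def train_perceptron(data, epochs=20, lr=0.1):
      # data: list of (features, label) pairs with label in {-1, +1}
      n = len(data[0][0])
      w = [0.0] * n
      b = 0.0
      for _ in range(epochs):
          random.shuffle(data)          # stochastic (per-example) updates
          for x, y in data:
              activation = sum(wi * xi for wi, xi in zip(w, x)) + b
              if y * activation <= 0:   # misclassified: nudge boundary
                  w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                  b += lr * y
      return w, b

  # A linearly separable toy problem: y = +1 iff x0 + x1 > 1
  data = [([random.random(), random.random()], 0) for _ in range(200)]
  data = [(x, 1 if x[0] + x[1] > 1 else -1) for x, _ in data]
  w, b = train_perceptron(data)
  print(w, b)   # a separating line close to x0 + x1 = 1

The per-example update on misclassified points is the simplest instance
of the stochastic gradient descent idea the course outline mentions.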
About the Instructors

Dan Hammerstrom received the B.S. degree in Electrical Engineering,
with distinction, from Montana State University, the M.S. degree in
Electrical Engineering from Stanford University, and the Ph.D. degree
in Electrical Engineering from the University of Illinois. He was on
the faculty of Cornell University from 1977 to 1980 as an assistant
professor. From 1980 to 1985 he worked for Intel, where he participated
in the development and implementation of the iAPX-432 and i960 and, as
a consultant, the iWarp systolic processor that was jointly developed
by Intel and Carnegie Mellon University. He is an associate professor
at Oregon Graduate Institute, where he is pursuing research in
massively parallel VLSI architectures, and is the founder and Chief
Technical Officer of Adaptive Solutions, Inc. He is the architect of
the Adaptive Solutions CNAPS neurocomputer. Dr. Hammerstrom's research
interests are in the area of VLSI architectures for pattern
recognition.

Todd K. Leen is associate professor of Computer Science and Engineering
at Oregon Graduate Institute of Science & Technology. He received his
Ph.D. in theoretical physics from the University of Wisconsin in 1982.
From 1982 to 1987 he worked at IBM Corporation, and then pursued
research in mathematical biology at Good Samaritan Hospital's
Neurological Sciences Institute. He joined OGI in 1989. Dr. Leen's
current research interests include neural learning, algorithms and
architectures, stochastic optimization, model constraints and pruning,
and neural and non-neural approaches to data representation and coding.
He is particularly interested in fast, local modeling approaches and in
applications to image and speech processing. Dr. Leen served as theory
program chair for the 1993 Neural Information Processing Systems (NIPS)
conference, and as workshops chair for the 1994 NIPS conference.

John E. Moody is associate professor of Computer Science and
Engineering at Oregon Graduate Institute of Science & Technology. His
current research focuses on neural network learning theory and
algorithms in their many manifestations. He is particularly interested
in statistical learning theory, the dynamics of learning, and learning
in dynamical contexts. Key application areas of his work are adaptive
signal processing, adaptive control, time series analysis, forecasting,
economics, and finance. Moody has authored over 35 scientific papers,
more than 25 of which concern the theory, algorithms, and applications
of neural networks. Prior to joining the Oregon Graduate Institute,
Moody was a member of the Computer Science and Neuroscience faculties
at Yale University. Moody received his Ph.D. and M.A. degrees in
theoretical physics from Princeton University, and graduated summa cum
laude with a B.A. in physics from the University of Chicago.

Hong Pi is a senior research associate at Oregon Graduate Institute. He
received his Ph.D. in theoretical physics from the University of
Wisconsin. His research interests include nonlinear modeling, and
neural network algorithms and applications.

Thorsteinn S. Rognvaldsson received the Ph.D. degree in theoretical
physics from Lund University, Sweden, in 1994. His research interests
are neural networks for prediction and classification. He is currently
a postdoctoral research associate at Oregon Graduate Institute.

Eric A. Wan, Assistant Professor of Electrical Engineering and Applied
Physics, Oregon Graduate Institute of Science & Technology, received
his Ph.D. in electrical engineering from Stanford University in 1994.
His research interests include learning algorithms and architectures
for neural networks and adaptive signal processing. He is particularly
interested in neural applications to time series prediction, speech
enhancement, system identification, and adaptive control. He is a
member of IEEE, INNS, Tau Beta Pi, Sigma Xi, and Phi Beta Kappa.

Course Dates: M-F, June 12-16, 1995, 8:30am-5pm

Course fee: $1695 (includes instruction, course materials, labs, break
refreshments and lunches, the Monday night reception, and the Thursday
night dinner)

For a complete course brochure contact:

Linda M. Pease, Director
Office of Continuing Education
Oregon Graduate Institute of Science & Technology
PO Box 91000
Portland, OR 97291-1000
+1-503-690-1259
+1-503-690-1686 (fax)
e-mail: continuinged@admin.ogi.edu
WWW home page: http://www.ogi.edu

------------------------------

End of ML-LIST (Digest format)
****************************************

From uzimmer@informatik.uni-kl.de Sun Apr 30 19:43:24 1995
Subject: Revised paper and www-server available (Mobile Robots,
         Self-Localization, Neural Nets)
To: connectionists@cs.cmu.edu, Neuron@cattell.psych.upenn.edu
From: "Uwe R. Zimmer, AG vP"
Date: Thu, 27 Apr 95 15:28:51 +0100
Reply-To: uzimmer@informatik.uni-kl.de
Message-Id: <950427.152851.1770@ag-vp-file-server.informatik.uni-kl.de>

This report was already announced a week ago, but PostScript problems
prevented the photos from printing on some printers (sorry) - this
version should print correctly on all printers.

---------------------------------------------------------------------------

A report on a current mobile robot project, addressing basic mobile
robot tasks, is available via ftp, or (together with some other
reports) from the following WWW server:

WWW-Server is : http://ag-vp-www.informatik.uni-kl.de/

---------------------------------------------------------------------------
--- Self-Localization in Dynamic Environments
---------------------------------------------------------------------------

FTP-Server is : ftp.uni-kl.de
Mode is       : binary
Directory is  : reports_uni-kl/computer_science/mobile_robots/1995/papers
File name is  : Zimmer.Self-Loc.ps.gz

IEEE/SOFT International Workshop BIES'95
May 30-31, 1995, Tokyo, Japan

Self-Localization in Dynamic Environments
Uwe R. Zimmer

Self-localization in unknown environments - that is, the correlation of
current and former impressions of the world - is an essential ability
for most mobile robots. The method proposed in this article is the
construction of a qualitative, topological world model as a basis for
self-localization. As a central aspect, reliability with respect to
error tolerance and stability is emphasized.
The proposed techniques place very few constraints on the kind and
quality of the sensors employed, as well as on the kinematic precision
of the mobile platform. Hard real-time constraints can be handled due
to the low computational complexity. The principal discussions are
supported by real-world experiments with the mobile robot "ALICE".

keywords: artificial neural networks, mobile robots, self-localization,
self-organization, world modelling

(8 pages with photos and other figures)

-----------------------------------------------------------
Uwe R. Zimmer
University of Kaiserslautern - Computer Science Department
67663 Kaiserslautern - Germany
Phone: +49 631 205 2624 | Fax: +49 631 205 2803
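To make the idea of a topological world model concrete, here is a
small, hypothetical Python sketch. It is not the method of the report
above (which builds its model with self-organizing neural techniques);
it only illustrates the general notion: places become nodes holding
sensor impressions, traversals become edges, and localization is
nearest-impression matching with a novelty threshold. All names and
values are invented.

  import math

  class TopoMap:
      def __init__(self, threshold=1.0):
          self.nodes = []        # sensor-impression vectors, one per place
          self.edges = set()     # pairs of node indices visited in sequence
          self.last = None
          self.threshold = threshold

      def observe(self, impression):
          # Find the closest stored impression (Euclidean distance).
          best, best_d = None, float("inf")
          for i, node in enumerate(self.nodes):
              d = math.dist(impression, node)
              if d < best_d:
                  best, best_d = i, d
          if best is None or best_d > self.threshold:
              self.nodes.append(list(impression))  # novel place: new node
              best = len(self.nodes) - 1
          if self.last is not None and self.last != best:
              self.edges.add((self.last, best))    # record the traversal
          self.last = best
          return best                              # current place estimate

  m = TopoMap()
  for reading in ([0.1, 0.2], [0.15, 0.22], [5.0, 5.1], [0.12, 0.21]):
      print("at node", m.observe(reading))

Because only relative impressions are stored, such a map needs no
global coordinates, which is one reason topological models tolerate
poor odometry.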