David Bingham Skalak



Visitor
Department of Computer Science
Cornell University
Ithaca, NY 14853-7501
email: skalak@cs.cornell.edu
map: Ithaca and the Cornell Campus


Biographical Sketch

I received a B.S. in mathematics summa cum laude from Union College (1976), an M.A. in mathematics from Dartmouth College (1979), a J.D. from Harvard Law School (1982), an M.S. (1989) and a Ph.D. (1997) in computer science from the University of Massachusetts at Amherst. I was a Fulbright Fellow at the University of Southampton, England, and have studied at the University of St. Andrews, St. Andrews, Scotland.

I'm currently a Senior Data Mining Analyst with IBM.

My current research interests include instance-based and local learning algorithms, classifier combination, case-based reasoning, AI and law, and the application of machine learning algorithms and knowledge discovery methods to equity selection, money management and market timing.
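
As a rough, generic illustration of two of the terms above, the sketch below shows a minimal instance-based (nearest-neighbor) classifier and a simple majority-vote form of classifier combination. It is hypothetical example code written for this page, not code from any of the papers listed below, and the data in it is made up.

# Illustrative sketch only: a generic 1-nearest-neighbor classifier and a
# simple majority-vote combiner. Hypothetical code and data, not drawn from
# any of the publications listed on this page.

from collections import Counter
import math


def nearest_neighbor_predict(train, query):
    """Classify `query` by the label of its closest training instance.

    `train` is a list of (feature_vector, label) pairs; distance is Euclidean.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    _, label = min(train, key=lambda pair: dist(pair[0], query))
    return label


def majority_vote(predictions):
    """Combine the outputs of several classifiers by simple majority vote."""
    return Counter(predictions).most_common(1)[0][0]


if __name__ == "__main__":
    # Three tiny, made-up prototype sets, each backing its own
    # nearest-neighbor classifier; their votes are then combined.
    prototype_sets = [
        [((0.0, 0.0), "A"), ((1.0, 1.0), "B")],
        [((0.1, 0.2), "A"), ((0.9, 0.8), "B")],
        [((0.0, 0.3), "A"), ((1.0, 0.7), "B")],
    ]
    query = (0.2, 0.1)
    votes = [nearest_neighbor_predict(p, query) for p in prototype_sets]
    print(majority_vote(votes))  # prints "A" for this toy data

Because an instance-based classifier compares a query directly against stored instances, which instances (prototypes) are stored largely determines its accuracy and cost; that is the question addressed by the prototype-selection work listed below.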


Curriculum Vitae

Academic CV
CV detailing financial experience

Editorial Activity

Editorial Advisory Board, Journal of Computational Intelligence in Finance, 1996--2000.

Selected Publications

Instance Sampling for Boosted and Standalone Nearest Neighbor Classifiers.
To appear in Instance Selection and Construction: A Data Mining Perspective, edited by H. Motoda and H. Liu, published by Kluwer Academic Publishers.

Prototype Selection for Composite Nearest Neighbor Classifiers.
Ph.D. dissertation. Dept. of Computer Science, Technical Report 96-89, University of Massachusetts, Amherst, Massachusetts. (postscript, 2464K). The thesis is also available here (compressed postscript, 758K).

The Sources of Increased Accuracy for Two Proposed Boosting Algorithms.
Proceedings of the AAAI-96 Integrating Multiple Learned Models Workshop, Portland, OR, American Association for Artificial Intelligence, Menlo Park, CA, 1996. (compressed postscript, 67K).

Prototype Selection for Composite Nearest Neighbor Classifiers.
Dissertation Proposal. Dept. of Computer Science, Technical Report 95-74, University of Massachusetts, Amherst, Massachusetts. (compressed postscript, 325K). Abstract.

Prototype and Feature Selection by Sampling and Random Mutation Hill-Climbing Algorithms.
Proceedings of the Eleventh International Conference on Machine Learning, pp. 293-301, New Brunswick, New Jersey, 1994. (postscript, 153K).

Using a Genetic Algorithm to Learn Prototypes for Case Retrieval and Classification.
Proceedings of the AAAI-93 Case-Based Reasoning Workshop (Technical Report WS-93-01), pp. 64-69, Washington, D.C., American Association for Artificial Intelligence, Menlo Park, CA, 1994. (Binhexed Macintosh Microsoft Word file).


Survival Guides

How to Succeed in Graduate School: A Guide for Students and Advisors
Marie desJardins, Crossroads 1.2, December 1994, and Crossroads 1.3, February 1995.
(Excellent. Check out the references and the appendix entitled "How to be a Terrible Thesis Advisor". Also available in other formats from http://www.erg.sri.com/people/marie/papers/advice-summary.)

A Ph.D. Is Not Enough: A Guide to Survival in Science
Peter J. Feibelman, Addison-Wesley, Reading, MA, 1993. ISBN 0-201-62663-2.
(Worth reading well before you receive your Ph.D.)

Getting What You Came For: The Smart Student's Guide to Earning a Master's or a Ph.D.
Robert L. Peters, Noonday Press, Farrar Straus and Giroux, New York, NY, 1992.
ISBN 0-374-52361-4.
(Don't let the hokey title deter you; this is a wonderful book.)

Notes on Presenting Theses
Aaron Sloman, unpublished manuscript, February 1992.
(Read this before you start writing. It has especially useful tips for describing large computer programs.)


Other People and Places to Visit

Claire Cardie
