Oct 31, 2003
-------------

- Recap: neural networks and decision trees
  - DTs are greedy, so they may not find the "optimal tree"
  - NNs are iterative improvement; they can also get stuck in local minima
- A more formal way to think of learning
  - version spaces
  - piston analogy
- Version spaces
  - maintain a most-specific boundary and a most-general boundary, and move them toward each other
  - If the data is consistent (given the bias)
    - the pistons move toward each other to enclose a set of consistent hypotheses
  - If the data is inconsistent (for the particular bias)
    - the pistons cross each other
  - Can arrange the hypotheses as a lattice
    - example bias: boolean conjunctions of at most three literals (positive or negative)
  - Worked out examples with version spaces
- Next topic: Knowledge in Learning
  - learning when you already know something
  - combine deduction and induction
- Induction
  - proceeding from the specific to the general
  - e.g., decision trees, neural nets
- Deduction
  - proceeding from the general to the specific
  - e.g., proving theorems, resolution
- How to combine these two?
  - many flavors here!
- Example 1
  - Type A - Explanation-Based Learning
  - "I was wondering if %s"
- Example 2
  - Type B - Relevance-Based Learning
  - "In India, elevators are called lifts."
- Example 3
  - Type C - Knowledge-Based Inductive Learning
  - "Naren is a good teacher."
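The version-space "piston" procedure described above (candidate elimination) can be sketched in code. This is a minimal illustration, not the lecture's worked example: it assumes a toy hypothesis space of conjunctions over boolean attributes, where each position of a hypothesis is `True`, `False`, or `None` (don't-care). The function and variable names (`covers`, `candidate_elimination`, `S`, `G`) are illustrative choices, not anything given in the notes.

```python
def covers(h, x):
    """A hypothesis covers an example if every constrained attribute matches."""
    return all(c is None or c == v for c, v in zip(h, x))

def candidate_elimination(examples):
    """Run candidate elimination over (attributes, label) pairs.

    Returns (S, G): the most-specific and most-general boundaries.
    S is a single conjunction here (or None); G is a list of conjunctions.
    """
    n = len(examples[0][0])
    S = None                    # most-specific boundary ("bottom piston")
    G = [tuple([None] * n)]     # most-general boundary: the all-don't-care hypothesis
    for x, label in examples:
        if label:               # positive example: generalize S, prune G
            if S is None:
                S = tuple(x)    # initialize S to the first positive example
            else:
                # drop any constraint that disagrees with x (minimal generalization)
                S = tuple(c if c == v else None for c, v in zip(S, x))
            G = [g for g in G if covers(g, x)]
        else:                   # negative example: specialize G, prune S
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                # minimally specialize g so it excludes x, staying above S
                for i in range(n):
                    if g[i] is None and (S is None or S[i] is not None):
                        h = list(g)
                        h[i] = (not x[i]) if S is None else S[i]
                        if not covers(tuple(h), x):
                            new_G.append(tuple(h))
            G = new_G
            if S is not None and covers(S, x):
                S = None        # "pistons crossed": data inconsistent with the bias
    return S, G
```

With consistent data the two boundaries close in on the target concept; for example, examples labeled by the concept "x0 AND x1" over three boolean attributes drive both `S` and `G` to `(True, True, None)`.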