Title: Enabling MT for Languages with Limited Resources
1. Enabling MT for Languages with Limited Resources
- Alon Lavie and Lori Levin
- Language Technologies Institute
- Carnegie Mellon University
2. Progression of MT
- Started with rule-based systems
  - Very large expert human effort to construct language-specific resources (grammars, lexicons)
  - High-quality MT extremely expensive → only for a handful of language pairs
- Along came EBMT and then SMT
  - Replaced human effort with extremely large volumes of parallel text data
  - Less expensive, but still only feasible for a small number of language pairs
  - We traded human labor for data
- Where does this take us in 5-10 years?
  - Large parallel corpora for maybe 25-50 language pairs
  - What about all the other languages?
- Is all this data (with very shallow representation of language structure) really necessary?
- Can we build MT approaches that learn deeper levels of language structure and how they map from one language to another?
3. Why Machine Translation for Languages with Limited Resources?
- We are in the age of the information explosion
  - The internet + web + Google → anyone can get the information they want, anytime
  - But what about the text in all those other languages?
    - How do they read all this English stuff?
    - How do we read all the stuff that they put online?
- MT for these languages would enable:
  - Better government access to native, indigenous, and minority communities
  - Better minority and native community participation in information-rich activities (health care, education, government) without giving up their languages
  - Civilian and military applications (disaster relief)
  - Language preservation
4. The Roadmap to Learning-based MT
- Automatic acquisition of necessary language resources and knowledge using machine-learning methodologies:
  - Learning morphology (analysis/generation)
  - Rapid acquisition of broad-coverage word-to-word and phrase-to-phrase translation lexicons
  - Learning of syntactic structural mappings
    - Tree-to-tree structure transformations (Knight et al., Eisner, Melamed) require parse trees for both languages
    - Learning syntactic transfer rules with resources (grammar, parses) for just one of the two languages
  - Automatic rule refinement and/or post-editing
- Effective integration of acquired knowledge with statistical/distributional information
5. CMU's AVENUE Approach
- Elicitation: use bilingual native informants to produce a small high-quality word-aligned bilingual corpus of translated phrases and sentences
- Transfer-rule learning: apply ML-based methods to automatically acquire syntactic transfer rules for translation between the two languages
  - Learn from major language to minor language
  - Translate from minor language to major language
- XFER + Decoder:
  - XFER engine produces a lattice of all possible transferred structures at all levels
  - Decoder searches and selects the best-scoring combination
- Rule refinement: refine the acquired rules via a process of interaction with bilingual informants
- Morphology learning
- Word and phrase bilingual lexicon acquisition
6. AVENUE Architecture
7. Learning Transfer-Rules for Languages with Limited Resources
- Rationale:
  - Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
  - Elicitation corpus designed to be typologically and structurally comprehensive and compositional
  - Transfer-rule engine and new learning approach support acquisition of generalized transfer rules from the data
8. Transfer Rule Formalism

;; SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
  (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
  ((X1 AGR) = 3-SING)
  ((X1 DEF) = DEF)
  ((X3 AGR) = 3-SING)
  ((X3 COUNT) = +)
  ((Y1 DEF) = DEF)
  ((Y3 DEF) = DEF)
  ((Y2 AGR) = 3-SING)
  ((Y2 GENDER) = (Y4 GENDER))
)

- Type information
- Part-of-speech/constituent information
- Alignments
- x-side constraints
- y-side constraints
- xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))
(A small representation sketch follows below.)
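To make the pieces above concrete, here is a minimal sketch (in Python, not the actual AVENUE/XFER implementation) of how a rule like this could be represented as data; the class and field names are illustrative assumptions:

from dataclasses import dataclass, field

@dataclass
class TransferRule:
    lhs_type: str      # source-side type, e.g. "NP"
    rhs_type: str      # target-side type, e.g. "NP"
    x_side: list       # source constituent sequence, e.g. ["DET", "ADJ", "N"]
    y_side: list       # target constituent sequence, e.g. ["DET", "N", "DET", "ADJ"]
    alignments: list   # (x_index, y_index) pairs, 1-based as in the formalism
    constraints: list = field(default_factory=list)  # (path, value-or-path) pairs

# The "old man" rule from the slide, transcribed into this representation:
old_man_rule = TransferRule(
    lhs_type="NP", rhs_type="NP",
    x_side=["DET", "ADJ", "N"],
    y_side=["DET", "N", "DET", "ADJ"],
    alignments=[(1, 1), (1, 3), (2, 4), (3, 2)],
    constraints=[
        (("X1", "AGR"), "3-SING"),
        (("X1", "DEF"), "DEF"),
        (("X3", "AGR"), "3-SING"),
        (("X3", "COUNT"), "+"),
        (("Y1", "DEF"), "DEF"),
        (("Y3", "DEF"), "DEF"),
        (("Y2", "AGR"), "3-SING"),
        (("Y2", "GENDER"), ("Y4", "GENDER")),  # a path value = agreement constraint
    ],
)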
9. Rule Learning - Overview
- Goal: acquire syntactic transfer rules
- Use available knowledge from the source side (grammatical structure)
- Three steps:
  - Flat seed generation: first guesses at transfer rules; flat syntactic structure
  - Compositionality: use previously learned rules to add hierarchical structure
  - Constraint learning: refine rules by learning appropriate feature constraints
10. Flat Seed Rule Generation
11. Compositionality
12. Constraint Learning
13. AVENUE Prototypes
- General XFER framework under development for the past two years
- Prototype systems so far:
  - German-to-English, Spanish-to-English
  - Hindi-to-English, Hebrew-to-English
- In progress or planned:
  - Mapudungun-to-Spanish
  - Quechua-to-Spanish
  - Arabic-to-English
  - Native Brazilian languages to Brazilian Portuguese
14. Morphology Learning
- Unsupervised learning of morphemes and their function from raw monolingual data
- Segmentation of words into morphemes
- Identification of morphological paradigms (inflections and derivations)
- Learning the association between morphemes and their function in the language
15. Morphology Learning: AVENUE Approach
- Organize the raw data in the form of a network of paradigm candidate schemes (a small sketch follows below)
- Search the network for a collection of schemes that represent true morphology paradigms of the language
- Learn mappings between the schemes and features/functions using minimal pairs of elicited data
- Construct an analyzer based on the collection of schemes and the acquired function mappings
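As a rough illustration of the first step above (assembling paradigm candidate schemes), here is a minimal Python sketch; the scheme construction is a simplified stand-in, not the AVENUE morphology learner:

from collections import defaultdict

def candidate_schemes(words):
    """Map each candidate stem to the set of suffixes it takes in the data."""
    suffixes_of_stem = defaultdict(set)
    for w in words:
        for i in range(1, len(w) + 1):      # every split point; '' suffix allowed
            stem, suffix = w[:i], w[i:]
            suffixes_of_stem[stem].add(suffix)
    # A scheme candidate is a suffix set shared by several stems (a paradigm).
    stems_of_scheme = defaultdict(set)
    for stem, sufs in suffixes_of_stem.items():
        stems_of_scheme[frozenset(sufs)].add(stem)
    return stems_of_scheme

# Toy Spanish-like data: the scheme {'o', 'as', 'a'} should attract both stems.
words = ["habla", "hablo", "hablas", "canta", "canto", "cantas"]
for scheme, stems in candidate_schemes(words).items():
    if len(stems) > 1 and len(scheme) > 1:
        print(sorted(scheme), "<-", sorted(stems))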
17. Automated Rule Refinement
- Rationale:
  - Bilingual informants can identify translation errors and pinpoint them
  - A sophisticated trace of the translation path can identify likely sources of the error and perform blame assignment
  - Rule refinement operators can be developed to modify the underlying translation grammar (and lexicon) based on characteristics of the error source (see the sketch below):
    - Add or delete feature constraints from a rule
    - Bifurcate a rule into two rules (general and specific)
    - Add or correct lexical entries
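The following minimal Python sketch illustrates two of the refinement operators listed above, reusing the TransferRule sketch from the Transfer Rule Formalism slide; the function names are hypothetical, not the AVENUE refinement module:

import copy

def delete_constraint(rule, constraint):
    """Generalize a rule by dropping one feature constraint (assumed present)."""
    refined = copy.deepcopy(rule)
    refined.constraints.remove(constraint)
    return refined

def bifurcate(rule, extra_constraint):
    """Split a rule into a general variant (unchanged) and a specific variant
    that additionally carries the constraint implicated by blame assignment."""
    general = copy.deepcopy(rule)
    specific = copy.deepcopy(rule)
    specific.constraints.append(extra_constraint)
    return general, specific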
18. Missing Science
- Monolingual learning tasks:
  - Learning morphology: morphemes and their meaning
  - Learning syntactic and semantic structures: grammar induction
- Bilingual learning tasks:
  - Automatic acquisition of word and phrase translation lexicons
  - Learning structural mappings (syntactic and semantic)
- Models that effectively combine learned symbolic knowledge with statistical information: new decoders
20. English-Chinese Example
21. English-Hindi Example
22. Spanish-Mapudungun Example
23. English-Arabic Example
24. Transfer Rule Formalism (II)

;; SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
  (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
  ((X1 AGR) = 3-SING)
  ((X1 DEF) = DEF)
  ((X3 AGR) = 3-SING)
  ((X3 COUNT) = +)
  ((Y1 DEF) = DEF)
  ((Y3 DEF) = DEF)
  ((Y2 AGR) = 3-SING)
  ((Y2 GENDER) = (Y4 GENDER))
)

- Value constraints, e.g. ((Y2 AGR) = 3-SING)
- Agreement constraints, e.g. ((Y2 GENDER) = (Y4 GENDER))
25. AVENUE Partners
26. The Transfer Engine
27. Seeded VSL: Some Open Issues
- Three types of constraints:
  - X-side: constrain applicability of the rule
  - Y-side: assist in generation
  - X-Y: transfer features from SL to TL
- Which of the three types improves translation performance?
  - Use rules without features to populate the lattice; the decoder will select the best translation
  - Learn only X-Y constraints, based on a list of universal projecting features
- Other notions of version spaces of feature constraints:
  - Current feature learning is specific to rules that have identical transfer components
  - An important issue during transfer is to disambiguate among rules that have the same SL side but different TL sides; can we learn effective constraints for this?
28. Examples of Learned Rules (Hindi-to-English)
29. XFER MT for Hebrew-to-English
- Two-month intensive effort to apply our XFER approach to the development of a Hebrew-to-English MT system
- Challenges:
  - No large parallel corpus
  - Limited-coverage translation lexicon
  - Rich morphology; only an incomplete analyzer available
- Accomplished:
  - Collected available resources, established a methodology for processing Hebrew input
  - Translated and aligned the elicitation corpus
  - Learned XFER rules
  - Developed a (small) manual XFER grammar as a point of comparison
  - System debugging and development
  - Evaluated performance on unseen test data using automatic evaluation metrics
31. Morphology Example
- Input word: B$WRH

  0     1     2     3     4
  |---------B$WRH---------|
  |--B--|--$WR---|---H----|
  |--B--|--H--|---$WRH----|
32. Morphology Example

Y0: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y1: ((SPANSTART 0) (SPANEND 2) (LEX B) (POS PREP))
Y2: ((SPANSTART 1) (SPANEND 3) (LEX $WR) (POS N) (GEN M) (NUM S) (STATUS ABSOLUTE))
Y3: ((SPANSTART 3) (SPANEND 4) (LEX $LH) (POS POSS))
Y4: ((SPANSTART 0) (SPANEND 1) (LEX B) (POS PREP))
Y5: ((SPANSTART 1) (SPANEND 2) (LEX H) (POS DET))
Y6: ((SPANSTART 2) (SPANEND 4) (LEX $WRH) (POS N) (GEN F) (NUM S))
Y7: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS LEX))
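A minimal sketch of this analysis lattice as data, with a helper that enumerates complete segmentations; the Arc class and field names are illustrative assumptions, not the actual system's structures:

from dataclasses import dataclass, field

@dataclass
class Arc:
    start: int
    end: int
    lex: str
    pos: str
    feats: dict = field(default_factory=dict)

lattice = [
    Arc(0, 4, "B$WRH", "N", {"GEN": "F", "NUM": "S", "STATUS": "ABSOLUTE"}),  # Y0
    Arc(0, 2, "B", "PREP"),                                                   # Y1
    Arc(1, 3, "$WR", "N", {"GEN": "M", "NUM": "S", "STATUS": "ABSOLUTE"}),    # Y2
    Arc(3, 4, "$LH", "POSS"),                                                 # Y3
    Arc(0, 1, "B", "PREP"),                                                   # Y4
    Arc(1, 2, "H", "DET"),                                                    # Y5
    Arc(2, 4, "$WRH", "N", {"GEN": "F", "NUM": "S"}),                         # Y6
    Arc(0, 4, "B$WRH", "LEX"),                                                # Y7
]

def paths(lattice, start, end):
    """Enumerate full segmentations: arc sequences covering [start, end]."""
    if start == end:
        yield []
    for arc in lattice:
        if arc.start == start and arc.end <= end:
            for rest in paths(lattice, arc.end, end):
                yield [arc] + rest

for p in paths(lattice, 0, 4):
    print(" + ".join(a.lex for a in p))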
33. Sample Output (dev-data)
- maxwell anurpung comes from ghana for israel four years ago and since worked in cleaning in hotels in eilat
- a few weeks ago announced if management club hotel that for him to leave israel according to the government instructions and immigration police
- in a letter in broken english which spread among the foreign workers thanks to them hotel for their hard work and announced that will purchase for hm flight tickets for their countries from their money
34. Evaluation Results
- Test set of 62 sentences from the Haaretz newspaper, 2 reference translations
35. Future Directions
- Continued work on automatic rule learning (especially Seeded Version Space Learning)
- Use the Hebrew and Hindi systems as test platforms for experimenting with advanced learning research
- Rule refinement via interaction with bilingual speakers
- Developing a well-founded model for assigning scores (probabilities) to transfer rules
- Redesigning and improving the decoder to better fit the specific characteristics of the XFER model
- Improved leveraging of manual grammar resources
- MEMT with improved:
  - Combination of output from different translation engines with different confidence scores
  - Strong decoding capabilities
36. Flat Seed Generation
- Create a transfer rule that is specific to the sentence pair, but abstracted to the POS level. No syntactic structure.
37. Compositionality - Overview
- Traverse the c-structure of the English sentence, adding compositional structure for translatable chunks
- Adjust constituent sequences and alignments
- Remove unnecessary constraints, i.e. those that are contained in the lower-level rule
38. Seeded Version Space Learning: Overview
- Goal: add appropriate feature constraints to the acquired rules
- Methodology:
  - Preserve general structural transfer
  - Learn specific feature constraints from the example set
- Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
- Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
- The seed rules in a group form the specific boundary of a version space
- The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints
39. Seeded Version Space Learning: Generalization
- The partial order of the version space:
  - Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
- Generalize rules by merging them (a small sketch of these operators follows below):
  - Deletion of a constraint
  - Raising two value constraints to an agreement constraint, e.g.
    ((x1 num) = pl), ((x3 num) = pl) → ((x1 num) = (x3 num))
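A minimal Python sketch of the two generalization operators, with constraints encoded as (path, value) pairs where a tuple value denotes an agreement constraint; names are illustrative, not the AVENUE code:

def drop_constraint(constraints, c):
    # Generalization 1: delete a single feature constraint.
    return [k for k in constraints if k != c]

def raise_to_agreement(constraints, path_a, path_b):
    # Generalization 2: replace two identical value constraints, e.g.
    # ((x1 num) = pl) and ((x3 num) = pl), with ((x1 num) = (x3 num)).
    vals = dict(constraints)
    if (vals.get(path_a) is not None
            and vals.get(path_a) == vals.get(path_b)
            and not isinstance(vals[path_a], tuple)):
        rest = [c for c in constraints if c[0] not in (path_a, path_b)]
        return rest + [(path_a, path_b)]
    return constraints

cs = [(("x1", "num"), "pl"), (("x3", "num"), "pl")]
print(raise_to_agreement(cs, ("x1", "num"), ("x3", "num")))
# -> [(('x1', 'num'), ('x3', 'num'))]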
40. Seeded Version Space Learning
(diagram: seed rules, e.g. S → [NP v det n] / NP VP, grouped into version spaces)
1. Group seed rules into version spaces as above.
2. Make use of the partial order of rules in a version space. The partial order is defined via the f-structures satisfying the constraints.
3. Generalize in the space by repeated merging of rules:
   - Deletion of a constraint
   - Moving value constraints to agreement constraints, e.g.
     ((x1 num) = pl), ((x3 num) = pl) → ((x1 num) = (x3 num))
4. Check the translation power of the generalized rules against the sentence pairs.
41. Seeded Version Space Learning: The Search
- The Seeded Version Space algorithm itself is the repeated generalization of rules by merging
- A merge is successful if the set of sentences that can correctly be translated with the merged rule is a superset of the union of the sets that can be translated with the unmerged rules, i.e. check the power of the rule
- Merge until no more successful merges are possible (a sketch of this check follows below)
42. Conclusions
- Transfer rules (both manual and learned) offer significant contributions that can complement existing data-driven approaches
  - Also in medium and large data settings?
- Initial steps toward development of a statistically grounded transfer-based MT system with:
  - Rules that are scored based on a well-founded probability model
  - Strong and effective decoding that incorporates the most advanced techniques used in SMT decoding
- Working from the opposite end of research on incorporating models of syntax into "standard" SMT systems (Knight et al.)
- Our direction makes sense in the limited-data scenario
43. AVENUE Architecture
(architecture diagram; components: Run-Time Module - SL Input, Morphology Pre-processing, SL Parser, Transfer Engine, Decoder, TL Generator, TL Output; Learning Module - User, Elicitation Process, Transfer Rule Learning, Transfer Rules)
44. Learning Transfer-Rules for Languages with Limited Resources
- Rationale:
  - Large bilingual corpora not available
  - Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
  - Elicitation corpus designed to be typologically comprehensive and compositional
  - Transfer-rule engine and new learning approach support acquisition of generalized transfer rules from the data
45. The Elicitation Corpus
- Translated and aligned by a bilingual informant
- Corpus consists of linguistically diverse constructions
- Based on the elicitation and documentation work of field linguists (e.g. Comrie 1977, Bouquiaux 1992)
- Organized compositionally: elicit simple structures first, then use them as building blocks
- Goal: minimize size, maximize linguistic coverage
46. The Transfer Engine
47. Transfer Rule Formalism

;; SL: the man, TL: der Mann
NP::NP [DET N] -> [DET N]
(
  (X1::Y1) (X2::Y2)
  ((X1 AGR) = 3-SING)
  ((X1 DEF) = DEF)
  ((X2 AGR) = 3-SING)
  ((X2 COUNT) = +)
  ((Y1 AGR) = 3-SING)
  ((Y1 DEF) = DEF)
  ((Y2 AGR) = 3-SING)
  ((Y2 GENDER) = (Y1 GENDER))
)

- Type information
- Part-of-speech/constituent information
- Alignments
- x-side constraints
- y-side constraints
- xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))
48. Transfer Rule Formalism (II)

;; SL: the man, TL: der Mann
NP::NP [DET N] -> [DET N]
(
  (X1::Y1) (X2::Y2)
  ((X1 AGR) = 3-SING)
  ((X1 DEF) = DEF)
  ((X2 AGR) = 3-SING)
  ((X2 COUNT) = +)
  ((Y1 AGR) = 3-SING)
  ((Y1 DEF) = DEF)
  ((Y2 AGR) = 3-SING)
  ((Y2 GENDER) = (Y1 GENDER))
)

- Value constraints, e.g. ((Y1 AGR) = 3-SING)
- Agreement constraints, e.g. ((Y2 GENDER) = (Y1 GENDER))
49. Rule Learning - Overview
- Goal: acquire syntactic transfer rules
- Use available knowledge from the source side (grammatical structure)
- Three steps:
  - Flat seed generation: first guesses at transfer rules; flat syntactic structure
  - Compositionality: use previously learned rules to add hierarchical structure
  - Seeded Version Space Learning: refine rules by generalizing with validation (learn appropriate feature constraints)
50. Examples of Learned Rules (I)
51. A Limited Data Scenario for Hindi-to-English
- Put together a scenario with miserly data resources:
  - Elicited data corpus: 17,589 phrases
  - Cleaned portion (top 12%) of the LDC dictionary: 2,725 Hindi words (23,612 translation pairs)
  - Manually acquired resources during the SLE (Surprise Language Exercise):
    - 500 manual bigram translations
    - 72 manually written phrase transfer rules
    - 105 manually written postposition rules
    - 48 manually written time expression rules
- No additional parallel text!
52. Manual Grammar Development
- Covers mostly NPs, PPs and VPs (verb complexes)
- 70 grammar rules, covering basic and recursive NPs and PPs, and verb complexes of the main tenses in Hindi (developed in two weeks)
53. Manual Transfer Rules: Example

;; PASSIVE OF SIMPLE PAST (NO AUX) WITH LIGHT VERB
;; passive of 43 (7b)
{VP,28}
VP::VP [V V V] -> [Aux V]
(
  (X1::Y2)
  ((x1 form) = root)
  ((x2 type) =c light)
  ((x2 form) = part)
  ((x2 aspect) = perf)
  ((x3 lexwx) = 'jAnA')
  ((x3 form) = part)
  ((x3 aspect) = perf)
  (x0 = x1)
  ((y1 lex) = be)
  ((y1 tense) = past)
  ((y1 agr num) = (x3 agr num))
  ((y1 agr pers) = (x3 agr pers))
  ((y2 form) = part)
)
54. Manual Transfer Rules: Example
(parse-tree diagrams for "jIvana ke eka aXyAya" / "one chapter of life" omitted from transcript)

NP1 ke NP2 -> NP2 of NP1
Ex: jIvana ke eka aXyAya
    life   of (one) chapter
    => a chapter of life

{NP,12}
NP::NP [PP NP1] -> [NP1 PP]
(
  (X1::Y2) (X2::Y1)
  ((x2 lexwx) = 'kA')
)

{NP,13}
NP::NP [NP1] -> [NP1]
(
  (X1::Y1)
)

{PP,12}
PP::PP [NP Postp] -> [Prep NP]
(
  (X1::Y2) (X2::Y1)
)
55. Adding a Strong Decoder
- XFER system produces a full lattice
- Edges are scored using word-to-word translation probabilities, trained from the limited bilingual data (a small scoring sketch follows below)
- Decoder uses an English LM (70m words)
- Decoder can also reorder words or phrases (up to 4 positions ahead)
- For XFER (strong), ONLY edges from the basic XFER system are used!
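A minimal sketch of the kind of edge scoring described above, combining a translation probability with language-model lookups when a hypothesis is extended; the probabilities and the lm_logprob stand-in are illustrative assumptions, not the actual decoder or its models:

import math

def extend_hypothesis(hyp_words, hyp_score, edge_words, edge_trans_prob, lm_logprob):
    """Score = running sum of log P_trans(edge) + log P_LM(word | context)."""
    score = hyp_score + math.log(edge_trans_prob)
    words = list(hyp_words)
    for w in edge_words:
        score += lm_logprob(tuple(words[-2:]), w)   # trigram-style LM lookup
        words.append(w)
    return words, score

# Toy usage with a uniform stand-in LM:
words, score = extend_hypothesis(["the", "man"], -1.2, ["arrived"], 0.3,
                                 lambda ctx, w: -2.0)
print(words, round(score, 3))   # ['the', 'man', 'arrived'] -4.404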
56. Testing Conditions
- Tested on a section of the JHU-provided data: 258 sentences with four reference translations
- SMT system (stand-alone)
- EBMT system (stand-alone)
- XFER system (naïve decoding)
- XFER system with strong decoder:
  - No grammar rules (baseline)
  - Manually developed grammar rules
  - Automatically learned grammar rules
- XFER+SMT with strong decoder (MEMT)
57. Results on JHU Test Set (very miserly training data)
58. Effect of Reordering in the Decoder
59. Observations and Lessons (I)
- XFER with strong decoder outperformed SMT, even without any grammar rules, in the miserly data scenario
  - SMT was trained on elicited phrases, which are very short
  - SMT has insufficient data to train more discriminative translation probabilities
  - XFER takes advantage of morphology
    - Token coverage without morphology: 0.6989
    - Token coverage with morphology: 0.7892
- Manual grammar currently somewhat better than automatically learned grammar
  - Learned rules did not yet use version-space learning
  - Large room for improvement on learning rules
  - Importance of effective, well-founded scoring of learned rules
60. Observations and Lessons (II)
- MEMT (XFER and SMT) based on the strong decoder produced the best results in the miserly scenario
- Reordering within the decoder provided very significant score improvements
  - Much room for more sophisticated grammar rules
  - Strong decoder can carry some of the reordering burden
61. Conclusions
- Transfer rules (both manual and learned) offer significant contributions that can complement existing data-driven approaches
  - Also in medium and large data settings?
- Initial steps toward development of a statistically grounded transfer-based MT system with:
  - Rules that are scored based on a well-founded probability model
  - Strong and effective decoding that incorporates the most advanced techniques used in SMT decoding
- Working from the opposite end of research on incorporating models of syntax into "standard" SMT systems (Knight et al.)
- Our direction makes sense in the limited-data scenario
62. Future Directions
- Continued work on automatic rule learning (especially Seeded Version Space Learning)
- Improved leveraging of manual grammar resources; interaction with bilingual speakers
- Developing a well-founded model for assigning scores (probabilities) to transfer rules
- Improving the strong decoder to better fit the specific characteristics of the XFER model
- MEMT with improved:
  - Combination of output from different translation engines with different scorings
  - Strong decoding capabilities
63. Rule Learning - Overview
- Goal: acquire syntactic transfer rules
- Use available knowledge from the source side (grammatical structure)
- Three steps:
  - Flat seed generation: first guesses at transfer rules; no syntactic structure
  - Compositionality: use previously learned rules to add structure
  - Seeded Version Space Learning: refine rules by generalizing with validation
64. Flat Seed Generation
- Create a transfer rule that is specific to the sentence pair, but abstracted to the POS level. No syntactic structure.
65. Flat Seed Generation - Example
- The highly qualified applicant did not accept the offer.
- Der äußerst qualifizierte Bewerber nahm das Angebot nicht an.
- Alignment: ((1,1),(2,2),(3,3),(4,4),(6,8),(7,5),(7,9),(8,6),(9,7))

S::S [det adv adj n aux neg v det n] -> [det adv adj n v det n neg vpart]
(
  ;; alignments:
  (x1::y1) (x2::y2) (x3::y3) (x4::y4) (x6::y8) (x7::y5) (x7::y9) (x8::y6) (x9::y7)
  ;; constraints:
  ((x1 def) = +)
  ((x4 agr) = 3-sing)
  ((x5 tense) = past)
  ...
  ((y1 def) = +)
  ((y3 case) = nom)
  ((y4 agr) = 3-sing)
  ...
)
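A minimal sketch of this step: from POS tag sequences and 1-based word alignments, build one flat seed rule; the dictionary- and morphology-derived feature constraints are omitted, and the dict layout is an illustrative assumption:

def flat_seed_rule(src_tags, tgt_tags, alignments):
    """src_tags/tgt_tags: POS tag sequences; alignments: 1-based (src, tgt) pairs."""
    return {
        "type": "S::S",
        "x_side": list(src_tags),
        "y_side": list(tgt_tags),
        "alignments": sorted(alignments),
    }

rule = flat_seed_rule(
    ["det", "adv", "adj", "n", "aux", "neg", "v", "det", "n"],
    ["det", "adv", "adj", "n", "v", "det", "n", "neg", "vpart"],
    [(1, 1), (2, 2), (3, 3), (4, 4), (6, 8), (7, 5), (7, 9), (8, 6), (9, 7)],
)
print(rule["x_side"], "->", rule["y_side"])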
66. Compositionality - Overview
- Traverse the c-structure of the English sentence, adding compositional structure for translatable chunks
- Adjust constituent sequences and alignments
- Remove unnecessary constraints, i.e. those that are contained in the lower-level rule
- Adjust constraints: use the f-structure of the correct translation vs. the f-structures of incorrect translations to introduce context constraints
67. Compositionality - Example

Flat seed rule:
S::S [det adv adj n aux neg v det n] -> [det adv adj n v det n neg vpart]
(
  ;; alignments: (x1::y1) (x2::y2) (x3::y3) (x4::y4) (x6::y8) (x7::y5) (x7::y9) (x8::y6) (x9::y7)
  ;; constraints: ((x1 def) = +) ((x4 agr) = 3-sing) ((x5 tense) = past) ...
  ;;              ((y1 def) = +) ((y3 case) = nom) ((y4 agr) = 3-sing) ...
)

Previously learned NP rule:
NP::NP [det ADJP n] -> [det ADJP n]
(
  (x1::y1) ((y3 agr) = 3-sing) ((x3 agr) = 3-sing) ...
)

Rule with compositional structure (see the sketch below):
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(
  ;; alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
  ;; constraints: ((x2 tense) = past) ... ((y1 def) = +) ((y1 case) = nom) ...
)
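A minimal sketch of the substitution performed above: replace the spans covered by the NP rule on both sides with the constituent label and renumber the alignments. Applied to the flat seed rule, it reproduces the composed alignments shown above; the function names are illustrative, not the AVENUE implementation:

def replace_span(seq, lo, hi, label):
    # Replace the 1-based inclusive span [lo, hi] with a single constituent label.
    return seq[:lo - 1] + [label] + seq[hi:]

def renumber(i, lo, hi):
    if i < lo:
        return i
    if i <= hi:
        return lo                  # positions inside the chunk collapse onto it
    return i - (hi - lo)

def compose(x_side, y_side, alignments, x_span, y_span, label):
    new_x = replace_span(x_side, *x_span, label)
    new_y = replace_span(y_side, *y_span, label)
    new_align = sorted({(renumber(xi, *x_span), renumber(yi, *y_span))
                        for xi, yi in alignments})
    return new_x, new_y, new_align

x = ["det", "adv", "adj", "n", "aux", "neg", "v", "det", "n"]
y = ["det", "adv", "adj", "n", "v", "det", "n", "neg", "vpart"]
a = [(1, 1), (2, 2), (3, 3), (4, 4), (6, 8), (7, 5), (7, 9), (8, 6), (9, 7)]
print(compose(x, y, a, (1, 4), (1, 4), "NP"))
# -> (['NP', 'aux', 'neg', 'v', 'det', 'n'],
#     ['NP', 'v', 'det', 'n', 'neg', 'vpart'],
#     [(1, 1), (3, 5), (4, 2), (4, 6), (5, 3), (6, 4)])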
68. Seeded Version Space Learning: Overview
- Goal: further generalize the acquired rules
- Methodology:
  - Preserve general structural transfer
  - Consider relaxing specific feature constraints
- Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
- Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
- The seed rules in a group form the specific boundary of a version space
- The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints
69. Seeded Version Space Learning
(diagram: seed rules, e.g. S → [NP v det n] / NP VP, grouped into version spaces)
1. Group seed rules into version spaces as above.
2. Make use of the partial order of rules in a version space. The partial order is defined via the f-structures satisfying the constraints.
3. Generalize in the space by repeated merging of rules:
   - Deletion of a constraint
   - Moving value constraints to agreement constraints, e.g.
     ((x1 num) = pl), ((x3 num) = pl) → ((x1 num) = (x3 num))
4. Check the translation power of the generalized rules against the sentence pairs.
70. Seeded Version Space Learning: Example

Seed rule (from a singular example):
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(
  ;; alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
  ((x2 tense) = past) ...
  ((y1 def) = +)
  ((y1 case) = nom)
  ((y1 agr) = 3-sing)
  ((y3 agr) = 3-sing)
  ((y4 agr) = 3-sing)
)

Merged (generalized) rule:
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(
  ;; alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
  ((x2 tense) = past)
  ((y1 def) = +)
  ((y1 case) = nom)
  ((y4 agr) = (y3 agr))
)

Seed rule (from a plural example):
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(
  ;; alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
  ((x2 tense) = past)
  ((y1 def) = +)
  ((y1 case) = nom)
  ((y1 agr) = 3-plu)
  ((y3 agr) = 3-plu)
  ((y4 agr) = 3-plu)
)
71. Preliminary Evaluation
- English to German
- Corpus of 141 ADJPs, simple NPs, and sentences
- 10-fold cross-validation experiment
- Goals:
  - Do we learn useful transfer rules?
  - Does compositionality improve generalization?
  - Does VS-learning improve generalization?
72. Summary of Results
- Average translation accuracy on the cross-validation test set: 62%
  - Without VS-learning: 43%
  - Without compositionality: 57%
- Average number of VSs: 24
- Average number of sentences per VS: 3.8
- Average number of merges per VS: 1.6
- Percentage of compositional rules: 34%
73. Conclusions
- New paradigm for learning transfer rules from a pre-designed elicitation corpus
- Geared toward languages with very limited resources
- Preliminary experiments validate the approach: compositionality and VS-learning improve generalization
74. Future Work
- Larger, more diverse elicitation corpus
- Additional languages (Mapudungun)
- Less information on the TL side
- Reverse translation direction
- Refine the various algorithms:
  - Operators for VS generalization
  - Generalization VS search
  - Layers for compositionality
  - User-interactive verification
75. Seeded Version Space Learning: Generalization
- The partial order of the version space:
  - Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
- Generalize rules by merging them:
  - Deletion of a constraint
  - Raising two value constraints to an agreement constraint, e.g.
    ((x1 num) = pl), ((x3 num) = pl) → ((x1 num) = (x3 num))
76. Seeded Version Space Learning: Merging Two Rules
- The merging algorithm proceeds in three steps. To merge tr1 and tr2 into tr_merged (a sketch follows below):
  1. Copy all constraints that are in both tr1 and tr2 into tr_merged.
  2. Consider tr1 and tr2 separately. For the remaining constraints in tr1 and tr2, perform all possible instances of raising value constraints to agreement constraints.
  3. Repeat step 1.
77. Seeded Version Space Learning: The Search
- The Seeded Version Space algorithm itself is the repeated generalization of rules by merging
- A merge is successful if the set of sentences that can correctly be translated with the merged rule is a superset of the union of the sets that can be translated with the unmerged rules, i.e. check the power of the rule
- Merge until no more successful merges are possible