gov.llnl.ontology.text.tokenize
Class OpenNlpMETokenizer

java.lang.Object
  extended by gov.llnl.ontology.text.tokenize.TokenizerAdaptor
      extended by gov.llnl.ontology.text.tokenize.OpenNlpMETokenizer
All Implemented Interfaces:
opennlp.tools.tokenize.Tokenizer

public class OpenNlpMETokenizer
extends TokenizerAdaptor

A wrapper around the TokenizerME Tokenizer so that it can be loaded with a no-argument constructor using a predefined model.

Author:
Keith Stevens

Field Summary
static String DEFAULT_MODEL
           
 
Fields inherited from class gov.llnl.ontology.text.tokenize.TokenizerAdaptor
tokenizer
 
Constructor Summary
OpenNlpMETokenizer()
          Loads the model configuration from DEFAULT_MODEL
OpenNlpMETokenizer(String modelPath, boolean loadFromJar)
          Loads a TokenizerME model from modelPath.
 
Method Summary
static opennlp.tools.tokenize.Tokenizer loadModel(String modelPath, boolean loadFromJar)
          Returns a Tokenizer stored in the file specified by modelPath.
 
Methods inherited from class gov.llnl.ontology.text.tokenize.TokenizerAdaptor
tokenize, tokenizePos
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

DEFAULT_MODEL

public static final String DEFAULT_MODEL
See Also:
Constant Field Values
Constructor Detail

OpenNlpMETokenizer

public OpenNlpMETokenizer()
Loads the model configuration from DEFAULT_MODEL


OpenNlpMETokenizer

public OpenNlpMETokenizer(String modelPath,
                          boolean loadFromJar)
Loads a TokenizerME model from modelPath. If loadFromJar is true, the binary model file will be loaded as a resource from the running class path; otherwise it is read from the file system.
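A minimal usage sketch of the two constructors described above. The model path "en-token.bin" is illustrative only; substitute a real TokenizerME model file, or use the no-argument form to fall back on DEFAULT_MODEL. Requires the OpenNLP and ontology jars on the class path.

```java
import opennlp.tools.tokenize.Tokenizer;
import gov.llnl.ontology.text.tokenize.OpenNlpMETokenizer;

public class TokenizeExample {
    public static void main(String[] args) {
        // No-argument form: loads the predefined DEFAULT_MODEL.
        Tokenizer tokenizer = new OpenNlpMETokenizer();

        // Explicit form: load a model file from disk (loadFromJar = false).
        // The path below is a placeholder, not a bundled resource.
        // Tokenizer fromDisk = new OpenNlpMETokenizer("en-token.bin", false);

        // Tokenizer.tokenize returns one String per detected token.
        String[] tokens = tokenizer.tokenize("Mr. Smith isn't here.");
        for (String token : tokens)
            System.out.println(token);
    }
}
```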

Method Detail

loadModel

public static opennlp.tools.tokenize.Tokenizer loadModel(String modelPath,
                                                         boolean loadFromJar)
Returns a Tokenizer stored in the file specified by modelPath. This method is static so that the constructors can pass the loaded Tokenizer directly to the TokenizerAdaptor constructor.
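A sketch of the pattern this static method enables, not the library's actual source. Java requires super(...) to be the first statement in a constructor, so the model loading is factored into a static helper whose result can be passed up to TokenizerAdaptor. The stream-handling details below are assumptions based on the standard OpenNLP TokenizerModel and TokenizerME APIs.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import opennlp.tools.tokenize.Tokenizer;
import opennlp.tools.tokenize.TokenizerME;
import opennlp.tools.tokenize.TokenizerModel;

// Hypothetical adaptor base class standing in for TokenizerAdaptor.
abstract class TokenizerAdaptorSketch {
    protected final Tokenizer tokenizer;
    TokenizerAdaptorSketch(Tokenizer tokenizer) { this.tokenizer = tokenizer; }
}

class OpenNlpMETokenizerSketch extends TokenizerAdaptorSketch {
    OpenNlpMETokenizerSketch(String modelPath, boolean loadFromJar) {
        // super(...) must come first, hence the static loadModel helper.
        super(loadModel(modelPath, loadFromJar));
    }

    static Tokenizer loadModel(String modelPath, boolean loadFromJar) {
        try (InputStream in = loadFromJar
                ? OpenNlpMETokenizerSketch.class.getResourceAsStream(modelPath)
                : new FileInputStream(modelPath)) {
            // TokenizerModel deserializes the binary model; TokenizerME
            // is OpenNLP's maximum-entropy tokenizer built from it.
            return new TokenizerME(new TokenizerModel(in));
        } catch (IOException e) {
            throw new RuntimeException("Could not load model: " + modelPath, e);
        }
    }
}
```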



Copyright © 2010-2011. All Rights Reserved.