A tokenizer for Go based on dictionary and Bigram language models. (Currently it only supports Chinese segmentation.)
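
To illustrate the dictionary-based side of the approach, here is a minimal, self-contained sketch of forward maximum-matching segmentation. It is not the library's actual API; the function name, dictionary, and sample text are hypothetical placeholders.

```go
package main

import (
	"fmt"
	"strings"
)

// segmentMaxMatch is an illustrative forward maximum-matching segmenter:
// at each position it takes the longest dictionary word that matches,
// falling back to a single rune when nothing matches.
func segmentMaxMatch(text string, dict map[string]bool, maxLen int) []string {
	runes := []rune(text)
	var tokens []string
	for i := 0; i < len(runes); {
		matched := 1
		for l := maxLen; l > 1; l-- {
			if i+l <= len(runes) && dict[string(runes[i:i+l])] {
				matched = l
				break
			}
		}
		tokens = append(tokens, string(runes[i:i+matched]))
		i += matched
	}
	return tokens
}

func main() {
	// Hypothetical two-entry dictionary for demonstration only.
	dict := map[string]bool{"你好": true, "世界": true}
	fmt.Println(strings.Join(segmentMaxMatch("你好世界", dict, 4), " / "))
	// Output: 你好 / 世界
}
```

In practice a Bigram language model can be layered on top of the dictionary to choose between competing segmentations by their word-pair probabilities, which is the combination this project describes.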