I am training wit.ai's understanding with a Python script that makes API calls to feed wit.ai sentences stored in a local file.
There is far more data (thousands of sentences) than I can manually validate (hundreds).
Does wit.ai prioritize what it asks me to validate, e.g. sentences with a low confidence score or a "no match"?
In my experience, wit.ai does not do this kind of prioritization. If that is true, I need to optimize my own training process and avoid flooding wit.ai with similar data that quickly reaches a high confidence score.
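One way I could do that pre-filtering myself (a sketch, not wit.ai's own behavior): before submitting a sentence for training, query wit.ai's GET /message endpoint and skip sentences the model already classifies with high confidence. The token placeholder, the 0.9 threshold, and the exact response parsing below are my assumptions based on wit.ai's HTTP API response shape, so they may need adjusting for a specific app.

```python
import json
import urllib.parse
import urllib.request

# Placeholder: replace with your app's server access token.
WIT_TOKEN = "YOUR_SERVER_ACCESS_TOKEN"

def top_confidence(message_response: dict) -> float:
    """Best intent confidence from a /message JSON response; 0.0 on no match."""
    intents = message_response.get("intents") or []
    return intents[0].get("confidence", 0.0) if intents else 0.0

def needs_training(sentence: str, threshold: float = 0.9) -> bool:
    """Query GET /message; True if the sentence is still worth feeding to wit.ai."""
    url = "https://api.wit.ai/message?" + urllib.parse.urlencode({"q": sentence})
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {WIT_TOKEN}"}
    )
    with urllib.request.urlopen(req) as resp:
        return top_confidence(json.load(resp)) < threshold
```

With this filter, the training loop would only submit sentences where `needs_training(...)` is True, so near-duplicate sentences stop being sent once the model handles them confidently.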