AI learns sign language by itself

The software, designed by scientists at the University of Oxford, enables a computer program to learn sign language simply by watching hand gestures and the accompanying subtitles on television. The algorithm first recognizes the gestures made by a human signer: it determines the rough location of the hands by tracking the arms, then identifies flesh-coloured pixels to recover precise hand shapes. The program then analyses the signs that co-occur with subtitle words and gradually builds a database of learned sign–word pairings. By the end of the experiment, the software correctly identified 65% of the keywords. Given the complexity of the task (a single word can appear in different contexts and therefore correspond to different signs), the researchers consider this a high success rate.
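The core idea of learning sign–word pairings from subtitles can be illustrated with a simple co-occurrence score. The sketch below is a minimal, hypothetical illustration of that weakly supervised step, not the Oxford team's actual method: it assumes video has already been cut into subtitle-aligned clips, with each clip described by a set of gesture cluster IDs, and it picks, for each keyword, the gesture that appears most reliably in clips containing that keyword while discounting gestures that appear everywhere. All function and variable names here are illustrative.

```python
from collections import defaultdict

def associate_signs_with_keywords(clips, keywords):
    """Pick, for each keyword, the gesture most correlated with it.

    clips: list of (gesture_ids, subtitle_words) pairs, one per video clip.
    keywords: list of subtitle words to find signs for.
    Returns a dict mapping keyword -> best-matching gesture id (or None).
    """
    cooccur = defaultdict(int)        # (keyword, gesture) -> joint clip count
    gesture_count = defaultdict(int)  # gesture -> total clip count
    keyword_count = defaultdict(int)  # keyword -> total clip count

    for gestures, words in clips:
        present = set(words) & set(keywords)
        for g in set(gestures):
            gesture_count[g] += 1
            for k in present:
                cooccur[(k, g)] += 1
        for k in present:
            keyword_count[k] += 1

    lexicon = {}
    for k in keywords:
        best, best_score = None, 0.0
        for g in gesture_count:
            if keyword_count[k] == 0:
                continue
            # Score: how often g appears when k is subtitled, minus how
            # often g appears overall (suppresses filler gestures).
            score = (cooccur[(k, g)] / keyword_count[k]
                     - gesture_count[g] / len(clips))
            if score > best_score:
                best, best_score = g, score
        lexicon[k] = best
    return lexicon
```

On toy data where one gesture consistently accompanies a keyword and another gesture appears in every clip, the scoring assigns the keyword to the consistent gesture rather than the ubiquitous one, which is the essence of learning from weak subtitle supervision.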