Your phone's keyboard might not be the most obvious candidate for software that uses artificial intelligence, but most keyboard apps on Android and iOS actually use a lot of 'brain-power' to try to predict what you're going to type next. SwiftKey, which has popular keyboards for both Android and iOS, appears to be taking this brain-power analogy quite literally, however, and on Thursday launched a new app called Neural Alpha on Google Play, which it claims uses artificial neural networks to predict and correct your typing.
In highly simplified terms, a neural network is a computing system loosely modelled on the way the human brain works. It pieces together the different information it is given, decides which parts are important, and thus "learns", comparing patterns and building up an intelligence over time.
In the case of SwiftKey's Neural Alpha, the experimental app uses neural networks to offer smarter, more meaningful suggestions based on the context of what you're writing. The neural network is a learning model: the app learns how you use different words and builds up its vocabulary in the process. It does this by identifying patterns and learning how those patterns overlap and intersect; the basic concept is not so different from how other companies are teaching computers to recognise images.
In the case of Neural Alpha, the app can "understand" word similarity, and analyse the longer sentence context instead of only suggesting words based on the last word entered. According to SwiftKey, the app also learns to understand complex word relationships, to make the predictions more accurate.
Of course, this kind of brain-power requires serious computing power as well, and that's why Neural Alpha is accelerated by the Android device's graphics processing unit (GPU), where possible. SwiftKey also notes on Google Play that the use of neural networks is computationally challenging and intended for more powerful smartphones.
There's a more detailed explanation on the SwiftKey blog.
Essentially, the existing SwiftKey keyboard uses something it calls 'word sequence n-gram' technology. This provides accurate predictions for common phrases and for phrases learned from users, but doesn't actually capture the meaning of the words. As a result, it can only accurately predict a word when it has seen it used in that same sequence before.
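To make the n-gram idea concrete, here is a minimal sketch of sequence-based prediction; it is not SwiftKey's implementation, just a toy bigram model that counts which word has followed each word in a tiny made-up corpus. Note that it can only suggest words it has literally seen after "the", which is exactly the limitation described above.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    # Count which word follows each word across all training sentences.
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, word, k=3):
    # Suggest the k words most often seen after `word` during training.
    return [w for w, _ in counts[word.lower()].most_common(k)]

corpus = [
    "let's meet at the airport",
    "let's meet at the office",
    "see you at the office",
]
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # → ['office', 'airport']
```

A word like "hotel" would never be suggested here, however plausible, because the model has no notion of meaning, only of observed sequences.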
SwiftKey's Neural Alpha, on the other hand, parses the sentence for meaning, to provide more "human" suggestions. According to the blog, it understands word similarity, allowing it to compare words on the fly. Within the neural model, words can be visualised in 'clusters', located at varying degrees of proximity to one another.
For example, having seen the phrase "Let's meet at the airport" during training, the technology is able to infer that "office" or "hotel" are similar words which could also be appropriate predictions in place of "airport". Further, it understands that "Let's meet at the airport" has a similar sentence structure to "Let's chat at the office". This intelligence allows SwiftKey Neural Alpha to offer the most appropriate word based on the sentence being typed.
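The "clusters" idea can be sketched with word vectors: words that appear in similar contexts get nearby vectors, and similarity is measured by the angle between them. The sketch below is purely illustrative; the vectors are hand-made values, not learned embeddings, and this is not SwiftKey's actual model.

```python
import math

# Toy embedding space: in a real neural model these vectors would be
# learned from large text corpora. Values here are invented so that
# "airport", "office" and "hotel" cluster together, away from "banana".
embeddings = {
    "airport": [0.90, 0.80, 0.10],
    "office":  [0.80, 0.90, 0.20],
    "hotel":   [0.85, 0.75, 0.15],
    "banana":  [0.10, 0.20, 0.90],
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def most_similar(word, k=2):
    # Rank the other words by how close their vectors are to `word`.
    scores = [(other, cosine(embeddings[word], vec))
              for other, vec in embeddings.items() if other != word]
    return [w for w, _ in sorted(scores, key=lambda t: -t[1])[:k]]

print(most_similar("airport"))  # → ['hotel', 'office']
```

Because "hotel" and "office" sit near "airport" in this space, either can be offered as a prediction in a sentence where "airport" would fit, which is the behaviour SwiftKey describes.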
The question is how successfully SwiftKey will be able to bring this kind of computing to a small, self-contained phone, as opposed to a huge server farm. The 25MB app clearly needs more optimisation for now, which is why it hasn't been released as a regular update to the main SwiftKey app, but if it can deliver over time on the promise outlined in SwiftKey's blog, this will be a big step forward.