One example of how #AI can make it easier for your staff, customers or suppliers to interact with your software tools is to add a combined "Next Step / Tell me what you want to do" facility.
This uses natural language processing (NLP), combined with knowledge of who the user is and their role (e.g. staff member, customer, supplier, or a user with admin rights) and the context (which page or part of the app they are on, and what data they have stored in the system), to add two powerful new ways for the user to interact with the app, with minimal training:
What’s my next step?
On any page, simply clicking the Go button asks the system "What's my next step?". The system then looks intelligently at the user's identity, role, data and location within the app, and makes one or more suggestions as to what the user could usefully do next to get the most out of the app.
Here are a couple of examples, taken from InQA’s WebPocketMoney application (referred to in this previous post).
Example 1: a new user has just registered and wonders what they should do. They could consult the online help file, which will tell them that they need to register their family within the system. But far more simply, they can just ask "What's my next step?" by clicking the Go button. The system guides them step by step, telling them initially that they need to create one or more families in the system:
When the user clicks OK, the system helps them create a family record.
Example 2: here (in InQA's WebPocketMoney app), the user has entered details of their family members and added weekly pocket money amounts and notional accounts (current and savings) for each child. (They could also have got this far using the traditional menu navigation supplied, but it is far easier to just ask "What's my next step?" repeatedly.)
They now click the Go button, and the system suggests that they add one or more rewards for each child member of the family:
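The two examples above can be sketched as a small set of rules over the user's context. This is a hypothetical illustration, not InQA's actual implementation: the class and function names are invented, and a real system might combine rules like these with an NLP/ML model trained on how users typically progress through the app.

```python
# Hypothetical sketch of a "What's my next step?" handler that ranks
# suggestions from the user's identity, role, and stored data.
from dataclasses import dataclass, field

@dataclass
class UserContext:
    role: str                                  # e.g. "staff", "customer", "supplier", "admin"
    page: str                                  # which part of the app the user is on
    data: dict = field(default_factory=dict)   # what the user has stored so far

def next_step(ctx: UserContext) -> list[str]:
    """Return suggested next actions, most relevant first."""
    suggestions = []
    # Illustrative rules mirroring Examples 1 and 2 above:
    if not ctx.data.get("families"):
        suggestions.append("Create one or more families in the system")
    elif not ctx.data.get("rewards"):
        suggestions.append("Add one or more rewards for each child")
    if ctx.role == "admin":
        suggestions.append("Review recent account activity")
    return suggestions

# A new user who has just registered is guided to create a family first:
print(next_step(UserContext(role="customer", page="home")))
```

The key design point is that the same Go button yields different suggestions for different users at different stages, because the suggestions are computed from the live context rather than from a static help page.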
Tell me what you want to do
The user can instead tell the app what s/he wants to do, by starting to type within the text box. As they type, the system uses auto-complete to suggest suitable actions for the user (again, looking intelligently at the user’s identity, role, data and location within the app).
In the above, the user can select from the suggested options, modify them, or ask for something completely different. The #AI behind the system will (using the full context of the request available to it) provide an intelligent response to the user’s request.
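The auto-complete side can be sketched just as simply: candidate actions are filtered against what the user has typed, and actions relevant to the current context are ranked first. Again, this is an invented illustration (the function name, action list and ranking scheme are assumptions), not the app's real code.

```python
# Hypothetical sketch of context-aware auto-complete for
# "Tell me what you want to do".
def suggest_actions(typed: str, actions: list[str], context_boost: set[str]) -> list[str]:
    """Return actions matching the typed text, context-relevant ones first."""
    typed = typed.lower()
    matches = [a for a in actions if typed in a.lower()]
    # Stable sort: actions flagged as relevant to the current page float up.
    return sorted(matches, key=lambda a: a not in context_boost)

actions = [
    "Add a family member",
    "Add a reward",
    "Pay weekly pocket money",
    "View savings account",
]
# On the rewards page, typing "add" suggests the reward action first:
print(suggest_actions("add", actions, context_boost={"Add a reward"}))
```

A production system would layer NLP on top of this matching step, so that "give Tom his money" resolves to "Pay weekly pocket money" even though the words don't match literally.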
As with chatbots, the app is becoming closer to a colleague that you can ask for help, and interact with using natural language.
(And sometimes its response will be “sorry, I don’t know, but I will find out!” – but that happens with human colleagues too!).