I am pleased to report that I have just passed another Microsoft course, this time from the Microsoft Professional Program for Artificial Intelligence:
Introduction to Artificial Intelligence, with a final mark of 100%.
This was a fascinating course, providing a very good introduction to machine learning, text analysis, computer vision (including face recognition and video analysis) and conversation as a platform (chatbots and Natural Language Processing [NLP]).
One example of how #AI can make it easier for your staff, customers or suppliers to interact with your software tools is to add a combined "Next Step / Tell me what you want to do" facility.
This uses natural language processing (NLP) combined with knowledge of who the user is (and what their role is: for example a member of staff, a customer, a supplier, or a user with admin rights) and the context (which page or part of the app they are on, and what data they have stored in the system) to add two powerful new ways for the user to interact with the app, with minimal training:
What’s my next step?
On any page, simply clicking the Go button asks the system "What's my next step?". The system then looks intelligently at the user's identity, role, data and location within the app, and makes one or more suggestions as to what the user could usefully do next to make the most of the app.
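As a rough illustration of the idea, the suggestion logic can be sketched as a small set of rules over the user's role, data and location in the app. All of the role names, page names and rules below are hypothetical placeholders, not taken from any real application:

```typescript
// Hypothetical "What's my next step?" resolver: illustrative only.

type Role = "staff" | "customer" | "supplier" | "admin";

interface UserContext {
  role: Role;
  currentPage: string;    // which page or part of the app the user is on
  hasStoredData: boolean; // whether the user has any data in the system
}

// Each rule inspects the user's role, data and location within the app,
// and may contribute a suggestion; a fallback covers everything else.
function nextSteps(ctx: UserContext): string[] {
  const suggestions: string[] = [];

  if (!ctx.hasStoredData) {
    suggestions.push("Add your first record to get started.");
  }
  if (ctx.role === "admin" && ctx.currentPage === "home") {
    suggestions.push("Review pending user accounts.");
  }
  if (ctx.role === "customer" && ctx.currentPage === "orders") {
    suggestions.push("Track your most recent order.");
  }
  if (suggestions.length === 0) {
    suggestions.push("Explore the reports page for an overview of your data.");
  }
  return suggestions;
}
```

In a real implementation the rules would be driven by an NLP/intent service rather than hard-coded conditionals, but the shape — context in, ranked suggestions out — stays the same.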
Here are a couple of examples, taken from InQA’s WebPocketMoney application (referred to in this previous post).
AI built into the heart of user interfaces
Within a few short years, some companies and organisations will have adopted Artificial Intelligence (AI) in at least one part of their work: interfacing with their customers. (I'm using "customers" in the widest sense of the word: it could mean students in education, or patients in healthcare, for example.)
Imagine the following:
- Instead of having to log in to a website or an application, the application simply recognises the user's face or voice.
- Instead of having to click on a menu to navigate the app, the user can just talk to it, either by speaking or using a chatbot-type interface.
- Instead of calling customer service (and being told "you are currently number two in a queue" or "our business hours are 0900 to 1700 Monday to Friday, please call back during those times"), they can get an immediate response (24 hours a day, 365 days a year) from a chatbot.
If customers have a choice between interacting with one organisation in that way, or another in the more traditional way, I think they will vote with their feet.
It’s a straightforward matter of economics
I am pleased to report that I have just passed another Microsoft course, this time not directly to do with Big Data or Data Science, but instead relating to user interfaces:
Introduction to ReactJS, with a final mark of 89%.
ReactJS is a modern open-source JavaScript library for building high-performance web user interfaces.
As I continue on my #datascience, #bigdata and #ai journey, I am pleased to have just completed my 6th course on the Microsoft Professional Program for Big Data with 84%: Delivering a Data Warehouse in the Cloud.
Four courses remain on the program, and I am still on track to complete it by my target of the end of January 2019.