In 1995, my 11-year-old mind was blown when I could control Windows 3.11 by speaking commands into a microphone on my desk. It was one of the coolest things I had ever seen! But it was far from practical.
24 years later, I find myself speaking in a natural manner to at least five different devices in my household. Sometimes they misunderstand me and need some manual help, but they’ve come a long way from “Left, Down, Enter”. Thanks to advancements in ASR, NLP, and AI, voice input is widespread and growing. And most of the time, it’s making our lives more convenient.
But what about our working lives?
In September 2018, Salesforce announced Einstein Voice. And my inner 11-year-old is ready to get his mind blown again. At Zero Keyboard, it’s been our mission to reduce the friction of working with CRM. So naturally, we’re interested in how this will help bring the world closer to that goal.
We took a close look at all the demo videos. Here are a few things that came to mind.
“I’m sorry, I didn’t quite catch that”
Although I’m not a native speaker, I’m told that my English is excellent. But from time to time, my Google Assistant insists that I wanted to set a timer for forty minutes instead of thirty. So I end up cancelling the command, trying again, or setting the timer myself. It turns out that I’m not the only one. Non-native speakers experience 30% more inaccuracies with voice recognition technologies.
The Salesforce client base spans the Americas, Europe, and APAC. And there are approximately 160 English dialects spoken throughout the world.
Spoken meeting notes most often contain (localized) names, dates, and numbers. But depending on the organization, they could include custom fields and objects as well.
So the big question is: How will Einstein Voice perform with dialects?
Will your meeting notes need extensive editing afterwards? Do dates and numbers need adjustments? A manual review of the results slows the process down, but it’s necessary. The last thing you want to do is enter incorrect data.
Sometimes it’s easier to point at something.
Restaurant menus in Tokyo are a prime example! Even though I have studied Japanese for Business, it was a huge relief to be able to point at a picture of the dish I wanted.
Touchscreens work in much the same way. Rather than typing a command or moving a mouse cursor, we do what we’ve done since we were babies: point and touch.
“I want that one”. Easy to understand.
Standard fields and objects will be easy for Salesforce Einstein to update. But would you rather spell out “T-65C-A2” or touch a picture of an X-Wing Starfighter on your mobile screen?
Some things are better left unsaid.
Vocalizing business-sensitive information in public is rarely a good idea. This means you will need to find the time and (a quiet) place to talk to Salesforce Einstein.
But once you do open up to Einstein, who else is listening?
You need to teach a machine in order for it to learn. Google, Apple, and Amazon all have manual review processes in place to train their systems. So whenever a command is not understood, a human will listen to it and tell the system what it was. It goes to show that even the biggest tech companies need to rely on humans to work their ‘computer magic’.
Salesforce and its tech providers are no exception.
Each of these organizations has taken care to protect its users. Data is anonymized, and policies prevent any sharing or long-term storage. But do you want anybody to know about that highly confidential deal you’re moving forward?
Get a mix that works for you
So will your sales floor soon be reverberating with Einstein commands? Probably not. But some of your salespeople might love working with it.
For a good mobile CRM experience, make sure you enable a mix of technologies. This ensures you empower your entire team to get the most out of it. Touchscreens still have plenty to offer for mobile sales teams! Tools like Zero Keyboard remove the stress of data entry and keep the focus on the right activity.
And when in doubt, just ask your team what they think works for them.