Have you used voice commands on your smartphone lately, with Siri or Google Assistant? They've been on phones for years, but most of us don't use them much.
But voice-to-text is improving quickly, and I already find it very useful. It's evolving again, too. Soon voice assistants will be networked with one another. You'll be able to ask Google Assistant on your phone to connect to another voice assistant, a bot belonging to, say, a hotel chain, and talk to that assistant to book a room. It's around the corner.
At a demonstration at the Google I/O conference last month, a user asked Google Assistant to contact the Panera service, where she placed an order, confirmed the delivery address and paid, all by voice.
Apple says that with machine learning, Siri has halved its voice recognition error rate. Texting, dictating and issuing commands to Siri on an iPhone is now an efficient use of time. Even when the transcription includes mistakes, I find correcting them still takes less time than typing.
On iPhone and iPad, you say "Hey Siri" or long-press the home button. If you're dictating, press the microphone key to the left of the space bar. I sometimes dictate directly into Pages or the Google Docs app on an iPhone while on the go.
On an Android phone, to activate the Assistant, say "OK, Google" or "Hey Google", or long-press the home button. I fire up Google Docs on an Android phone, open a document, select the microphone on the keyboard (you may have to delve to find it) and dictate straight into the document. When I open Docs on a work PC, the document is there, with no need to transfer files.
Voice is especially powerful for getting a phone or smartwatch to perform tasks. You can turn on a light, create a calendar entry or set a weekly reminder to pay the rent, all by speaking. Here are some useful ideas in each ecosystem.
Apple Siri
Has your Reminders app been sitting on your iPhone unused for umpteen years? It’s time to resurrect it. With voice, it’s powerful. You can mutter a reminder to Siri while an idea is in your head, before you forget it.
I set reminders all the time. It could be to check the mail before I leave home, to thaw meat in the fridge at 4pm, to pay a bill tomorrow or return a library book in four days. Anything I could forget.
You could say "remind me to pick up the dry cleaning in two hours" or "remind me to pick up my daughter every Thursday at 5pm".
You can also create specific lists. If you open the Reminders app and create a list called "shopping", you can say "add milk to my shopping list". Later, at the supermarket, you can say "show my shopping list" and Siri will read it out.
Reminders don’t have to be time specific; they can be location based. You can say “remind me to call my sister when I get home” or “remind me to collect mail when I get to work” or “remind me to buy batteries when I get to Bunnings Mascot”. The location can be a business or a contact.
With calendar events, you can set up all kinds of meetings and use everyday language to define the time. If I say "schedule a meeting with David next Tuesday at 2pm", it will ask which David in my contacts I mean. Siri reads out and checks the prospective calendar event before saving it.
You can schedule recurring events too, by saying something like "schedule a recurring event every Thursday at 9am called group meeting".
Siri reminds you with a notification on your phone. She doesn't speak the reminder aloud, so you could miss it. To avoid this, go to Settings > Sounds and set a reminder tone you won't miss. I use "cosmic". The reminder can appear on an Apple Watch if your phone is locked.
Apple Watch becomes more usable when operated by voice. For example, on my way to a Sydney railway station in the mornings, I glance at the TripView app on my wrist to see how long I have. I don't hunt for it in the menus; I just say "open the TripView app" to Siri on the watch.
I find it useful to let Siri search emails for what I want. I can say “show me emails about Bitcoin” and it will keyword search for Bitcoin and read out matching emails. My one irritation is that Siri spells out web addresses and non-alphabetical characters, which gets very tedious. Apple needs to work on this.
If you ask Siri to "show me the nearest restaurant", or "show me five close Japanese restaurants", it will give details, general pricing and a star rating, and offer to call the restaurant or give you directions. The details it offers vary with location, though.
Hands-free calling is very useful. If I say “Hey Siri, call mum’s mobile on speaker”, Siri will make the call and select speakerphone.
You can ask Siri to “navigate to central Sydney by public transport”, or to “open general settings”, or “post on Facebook I’m feeling great today” to update your status, “disable mobile data”, or “convert $US5 to Australian dollars”.
Siri also performs arithmetic, currency and time zone conversions, searches photos, tells you stock prices and the movies playing tonight — there are innumerable things to try.
Google Assistant
Google has made huge strides in incorporating artificial intelligence and machine learning into Google Assistant. Speech recognition is more accurate and the assistant’s voice sounds natural.
Google has a big vision of Assistant being part of a voice-connected world where it links to a retail bot assistant at a shop and you can talk your way through a purchase.
Basic functionality with Google Assistant has improved. You can open an app by saying “open Facebook”, or a web page by saying “go to www.theaustralian.com.au”. You can ask the assistant to “search for laptops on eBay”.
You can say “call Adam”, “set an alarm for 20 minutes”, “add milk to my shopping list” or “create a calendar event for 4pm tomorrow”.
Emailing is easy. Say something like: “Email Chris subject today message What are you doing today?” You can add a CC and BCC, and if there is more than one Chris in your contacts the assistant asks you which one. If the recipient has several email addresses, it will ask you which to use. You confirm the final email before sending.
If you just say “email Chris”, the assistant steps you through missing details. It will ask which Chris and what the message is. It engages you in a conversation to work out what you need.
Machine learning applies to your photos, too. It recognises objects even when they’re not captioned. You could ask “show me photos of Sydney with a pool”.
Google Assistant interprets a small pause as the end of your instruction, so spit it out in one go. Read back any voice-generated message before sending it.
Information requests are context sensitive. If I ask "How old is Paul McCartney?" and then "What is his latest album?", Assistant answers the follow-up correctly; it knows I mean McCartney's album.
Assistant does restaurants differently to Siri. If I ask "find the nearest five Japanese restaurants", it will show them on screen. If you click the restaurant link you get options to call, get directions, see a street view and, where available, view the menu. That's a menu in one voice command and two clicks.