In October 2018, Nuance announced that it has discontinued Dragon Professional Individual for Mac and will support it for only 90 days from activation in the US or 180 days in the rest of the world. The continuous speech-to-text software was widely considered to be the gold standard for speech recognition, and Nuance continues to develop and sell the Windows versions of Dragon Home, Dragon Professional Individual, and various profession-specific solutions.

This move is a blow to professional users—such as doctors, lawyers, and law enforcement—who depended on Dragon for dictating to their Macs, but the people most significantly affected are those who can control their Macs only with their voices.

What about Apple’s built-in accessibility solutions? macOS does support voice dictation, although my experience is that it’s not even as good as dictation in iOS, much less Dragon Professional Individual. Some level of voice control of the Mac is also available via Dictation Commands, but again, it’s not as powerful as what was available from Dragon Professional Individual.

TidBITS reader Todd Scheresky is a software engineer who relies on Dragon Professional Individual for his work because he’s a quadriplegic and has no use of his arms. He has suggested several ways that Apple needs to improve macOS speech recognition to make it a viable alternative to Dragon Professional Individual:

  • Support for user-added custom words: Every profession has its own terminology and jargon, which is part of why there are legal, medical, and law enforcement versions of Dragon for Windows. Scheresky isn’t asking Apple to provide such custom vocabularies, but he needs to be able to add custom words to the vocabulary to carry out his work.
  • Support for speaker-dependent continuous speech recognition: Currently, macOS’s speech recognition is speaker-independent, which means that it works pretty well for everyone. But Scheresky believes it needs to become speaker-dependent, so it can learn from your corrections to improve recognition accuracy. Also, Apple’s speech recognition isn’t continuous—it works for only a few minutes before stopping and needing to be reinvoked.
  • Support for cursor positioning and mouse button events: Although Scheresky acknowledges that macOS’s Dictation Commands are pretty good and provide decent support for text cursor positioning, macOS has nothing like Nuance’s MouseGrid, which divides the screen into a 3-by-3 grid, lets the user zoom in to a grid coordinate, and then displays another 3-by-3 grid within that cell to continue zooming (a rough sketch of the idea follows this list). Nor does Apple have anything like Nuance’s mouse commands for moving and clicking the mouse pointer.
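
To make the MouseGrid idea concrete, here is a minimal sketch in Swift of how a spoken 3-by-3 grid scheme can converge on a screen coordinate. It is only an illustration of the general technique under my own assumptions (the cell numbering and the `zoom(into:of:)` helper are invented for the example); it is not Nuance’s implementation and not a macOS API.

```swift
import CoreGraphics

// Illustrative only: each spoken digit 1-9 selects one cell of a 3-by-3 grid,
// and that cell becomes the region to subdivide next, so a handful of digits
// narrows the whole screen down to a small target for a click.
func zoom(into cell: Int, of region: CGRect) -> CGRect {
    precondition((1...9).contains(cell), "cells are numbered 1 through 9")
    let col = (cell - 1) % 3   // 0, 1, 2 column index, left to right
    let row = (cell - 1) / 3   // 0, 1, 2 row index (assuming a top-left origin)
    let cellSize = CGSize(width: region.width / 3, height: region.height / 3)
    return CGRect(x: region.minX + CGFloat(col) * cellSize.width,
                  y: region.minY + CGFloat(row) * cellSize.height,
                  width: cellSize.width,
                  height: cellSize.height)
}

// Saying "5, 1, 9" on a 1440-by-900 display narrows the full screen to a
// region roughly 53 by 33 points, whose center could then receive the click.
var region = CGRect(x: 0, y: 0, width: 1440, height: 900)
for spokenCell in [5, 1, 9] {
    region = zoom(into: spokenCell, of: region)
}
print("Click target:", region.midX, region.midY)
```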

When Scheresky complained to Apple’s accessibility team about macOS’s limitations, they suggested the Switch Control feature, which enables users to move the pointer (along with other actions) by clicking a switch. He talks about this in a video.

Unfortunately, although Switch Control would let Scheresky control a Mac using a sip-and-puff switch or a head switch, such solutions would be both far slower than voice and a literal pain in the neck. There are some better alternatives for mouse pointer positioning:

  • Dedicated software, in the form of a $35 app called iTracker.
  • An off-the-shelf hack using Keyboard Maestro and Automator.
  • An expensive head-mounted pointing device: the SmartNav is $600, and the HeadMouse Nano and TrackerPro are both about $1000. It’s also not clear how well these devices interface with current versions of macOS.

Regardless, if Apple enhanced macOS’s voice recognition in the ways Scheresky suggests, it would become significantly more useful and would give users with physical limitations significantly more control over their Macs… and their lives. If you’d like to help, Scheresky suggests submitting feature request feedback to Apple with text along the following lines (feel free to copy and paste it):

Because Nuance has discontinued Dragon Professional Individual for Mac, it is becoming difficult for disabled users to use the Mac. Please enhance macOS speech recognition to support user-added custom words, speaker-dependent continuous speech recognition that learns from user corrections to improve accuracy, and cursor positioning and mouse button events.

Thank you for your consideration!

Thanks for encouraging Apple to bring macOS’s accessibility features up to the level necessary to provide an alternative to Dragon Professional Individual for Mac. Such improvements will help both those who face physical challenges to using the Mac and those for whom dictation is a professional necessity.

One of the features in OS X Mavericks that I was most looking forward to was offline dictation.

Back in OS X Mountain Lion, Apple added the systemwide Dictation tool, similar to Siri in iOS. You pressed a key combination (by default, the Fn key twice) and started talking to your Mac, and it recorded and transcribed what you said. But this feature required an Internet connection and worked for only brief periods of time—about 30 seconds—before your Mac stopped listening to your speech and headed off to Apple’s servers to have your words transcribed.

My biggest complaint about this implementation was that it didn’t give you any feedback about your dictation until your transcribed text returned to your Mac. If something went wrong, you had no idea until (a) you were done speaking and (b) OS X had finished transcribing what you said.

OS X transcription 2.0

That’s no longer the case. In OS X Mavericks, you now have the option of downloading a file that supports offline dictation. To set it up, you go to the Dictation & Speech pane in System Preferences and tick the Use Enhanced Dictation box. That causes the file to download. (Note: It’s a big one—785MB.)

Having this transcription-support file on your Mac dramatically improves the functionality of OS X’s built-in Dictation feature. Now, when you press the Fn key twice and start speaking, the words appear on screen as you speak. The feature works anywhere on the Mac that you can enter text, no training or customization necessary. Just press the key and start talking. In fact, it’s how I’m adding this very text.

Overall, I really like the feature. With my Retina MacBook Pro, the two microphones are so good that I can even dictate without first donning a headset microphone (a traditional requirement for dictation). I find myself using it throughout the operating system and in places that I’d never thought of using dictation before, including online forms and annotations to PDF files. It’s great.

But Mac dictation isn’t new to Mavericks. I’ve been dictating to computers for a long time. (When I first started dictating, you … had … to … talk … like … this … leaving … a … space … between … each … word.) My usual tool is Dragon Dictate for Mac. So when I heard that Apple was improving the Dictation tool in OS X, my first question was: How will it compare to Dragon?

(Note that, while Apple has never stated publicly where it got the technology behind Siri dictation, I strongly suspect it is Nuance, the same company that publishes Dragon Dictate.)

And so I decided to put the two dictation systems to the test. I took a single passage of text and read it aloud to my Mac, first using Mavericks’s built-in Dictation tool and then using Dragon’s. The differences were striking.

Putting them to the test

Just using the two products is a different experience. Dictation software doesn’t understand speech the same way humans do. We continually and instantaneously parse the words we hear based on context; that’s how we know the difference between “ice cream” and “I scream.” Computers do much the same thing, but they aren’t as good at it.
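
As a toy illustration of that kind of context-based disambiguation (my own sketch, not how Apple’s or Nuance’s engines actually work), the snippet below ranks two acoustically similar transcriptions by how common their adjacent word pairs are; the bigram table and its counts are invented for the example.

```swift
// Illustrative only: score candidate transcriptions of the same audio by how
// plausible their word sequence is, using a tiny hand-made bigram table as a
// stand-in for "context." Real recognizers use far richer language models.
let bigramCounts: [String: Int] = [
    "eat ice": 40, "ice cream": 120,
    "eat i": 2, "i scream": 15,
]

func contextScore(_ words: [String]) -> Int {
    // Sum the frequency of each adjacent word pair; unseen pairs score zero.
    return zip(words, words.dropFirst())
        .map { pair in bigramCounts["\(pair.0) \(pair.1)", default: 0] }
        .reduce(0, +)
}

let candidates = [["eat", "ice", "cream"], ["eat", "i", "scream"]]
let best = candidates.max { contextScore($0) < contextScore($1) }!
print(best.joined(separator: " "))   // "eat ice cream" wins on context
```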

What this means is that, in Mavericks’s Dictation system, words appear on the screen as I speak them, but in a disjointed way, as the system tries to figure out what I’m saying. The words themselves and their order change as I get deeper into a sentence; things keep switching around. Sometimes the screen gets so jumpy that it’s distracting. Dragon Dictate doesn’t put words on the screen as fast as Mavericks’s Dictation, but the words it does put up are usually closer to the final transcription than in Dictation.

The real test, however, is accuracy. To assess that, I used both the Mavericks Dictation tool and Dragon Dictate to transcribe a four-paragraph, 268-word passage of text. I ran through the passage three times in Mavericks, to iron out some kinks, and just once in Dragon Dictate. I didn’t use my existing user profile in Dragon Dictate, in an attempt to make the playing field even.

The results? Both programs made mistakes. Mavericks Dictation’s errors were more frequent and more ridiculous, however. For instance, when I said “detail,” it transcribed “D tell.” When I said “expository,” it heard “Expo is a Tory.” The program had particular problems with the sentence “Students must be jarred out of this approach.” I spent several minutes trying to get Dictation to transcribe “jarred” and “jar” correctly; each time it transcribed them both as “John.” I also found it odd that Dictation refused to insert a space before opening quotation marks; it failed to do so in every instance of my test.

In the end, Mavericks’s built-in Dictation tool made 28 mistakes.

Dragon Dictate had fewer problems but still made some mistakes of its own. It too tripped on “expository,” but less hilariously than Dictation, writing “expositors” instead. It insisted on transcribing “class scored” as “classic lord.” Overall, it made nine mistakes.

So the final accuracy scores were 96.6 percent for Dragon Dictate and 89.6 percent for Mavericks’s Dictation. That difference might seem insubstantial (Mavericks still earned a very high B), but if you were to dictate a passage of 10,000 words, the text would contain more than 1000 errors with Mavericks’s Dictation tool, versus about a third of that with Dragon Dictate.
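
For readers who want to check the math, the arithmetic behind those figures works out as follows; the word and mistake counts come from the test described above, and the Swift snippet is just a convenient calculator.

```swift
import Foundation

// Accuracy from the 268-word test passage, and the error count you'd expect
// if the same error rate held over a 10,000-word dictation.
let passageWords = 268.0
let mistakes = [("Dragon Dictate", 9.0), ("Mavericks Dictation", 28.0)]

for (tool, errors) in mistakes {
    let accuracy = (passageWords - errors) / passageWords * 100   // percent correct
    let errorsPer10k = errors / passageWords * 10_000             // extrapolated errors
    print("\(tool): \(String(format: "%.1f", accuracy))% accurate, "
          + "about \(Int(errorsPer10k.rounded())) errors per 10,000 words")
}
// Dragon Dictate: 96.6% accurate, about 336 errors per 10,000 words
// Mavericks Dictation: 89.6% accurate, about 1045 errors per 10,000 words
```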

The bottom line

This result isn’t so surprising. Dragon Dictate is a paid application with several years’ worth of development effort behind it. Also, Dragon Dictate requires you to spend time training it before it will even work, so it has a much better idea of your voice and the way in which you speak.

In addition to increased accuracy, Dragon Dictate has the ability to learn words you use often, and nearly always handles proper names better than the Mavericks Dictation tool. Dragon Dictate also has several additional features for controlling the user interface that are simply not available with the Dictation module in Mavericks.

In other words, Dragon Dictate is a fully developed, feature-rich product; Mavericks’s Dictation, not so much. Then again, Dragon Dictate costs $200, while the Mavericks tool is free.

The way I see it, Mavericks’s Dictation tool is like Dragon Dictate Lite. Nevertheless, I’m finding use for both of them. The Mavericks tool’s best feature is the ability to activate it anywhere on my Mac and immediately start dictating; I’m using it in all sorts of unexpected places on my Mac. Dragon Dictate is not as easy to get working in any context, but when you need to dictate long passages of text, its increased accuracy makes it the clear choice.

Read our full Dragon Dictate for Mac 3 review