Starkey Edge AI: The Future of Hearing is Here, and I've Tested It
With the October 2024 release of the Starkey Edge AI, it’s hard to believe that Starkey is now on their fourth generation of hearing aids using deep neural network (DNN) technology. In a nod to Starkey’s first DNN-powered hearing aid, 2020’s Livio Edge AI, Starkey is making it clear this time that their DNN-powered ‘Edge Mode’ is not just a feature; it’s the star of the show.
And for good reason. The advancements driven by Starkey's DNN are responsible for the most impactful improvements in this release. That’s not to discount the other improvements Starkey’s made with Edge AI—they’re substantial. Starkey has jam-packed an insane amount of tech into the Edge AI hearing aids, and they’ve done it while maintaining the industry’s best battery life: up to 51 hours. You can read the full press release here.
New G2 Neuro Processor
Starkey’s 2023 Genesis AI model was powered by a new chip that Starkey called the Neuro Processor. This chip was Starkey’s smallest and most powerful ever, with 6x more transistors and 4x the speed of previous processors.
Starkey says they’ve made it even better: the 2.0 version—the G2 Neuro Processor—adds a neural processing unit (NPU). You've probably heard of hardware components like CPUs (Central Processing Units) and GPUs (Graphics Processing Units). NPUs are their AI-focused cousins—processors specifically designed to handle the calculations behind artificial intelligence and machine learning tasks.
Smartphones use NPUs to accelerate machine learning tasks such as facial recognition. Smart home devices like the Amazon Echo use them to enhance voice recognition. Smartwatches use them to handle health tracking features like heart rate monitoring. According to Starkey, Edge AI is the first hearing aid to have an NPU fully integrated into the chip, and this gives the Edge AI unmatched DNN processing capabilities—100x more DNN processing compared to the Genesis AI.
The result? Starkey claims Edge AI is *30% more accurate at identifying speech. If that figure holds up—and we have no reason to doubt it—it points to a significantly improved signal-to-noise ratio (SNR) for Edge AI hearing aids. That enhancement would make it much easier for users to hear speech in noisy environments, as even small improvements in SNR can lead to substantial gains in speech recognition.
*We’ve been unable to find the evidence supporting this, but will link to it and publish more information here once it is available.
Next-Level Connectivity with Bluetooth Low Energy (LE) Audio
Starkey—and the whole industry for that matter—is finally switching away from the Made for iPhone (MFi) and Audio Streaming for Hearing Aids (ASHA) protocols. If Edge AI is going to be your first hearing aid, that probably won't mean anything to you. For those of you who’ve used hearing aids in the last 10 years and tried to get them to work with your phone, I will give you a moment to scream for joy. I won’t get into all the drawbacks of the MFi and ASHA protocols here—those are well documented across the web by angry hearing aid wearers.
Bluetooth LE Audio promises a lot of things to hearing aid wearers: broader connectivity to more devices, better streaming sound quality, less latency, wider Bluetooth range, better battery life. But unfortunately, for the vast majority of people, none of that matters yet, because there are very few phones that use Bluetooth LE Audio—it’s going to take a few years for all manufacturers to ship their tech with this Bluetooth protocol.
Devices that don’t have Bluetooth LE Audio will still be able to pair with Edge AI hearing aids via the MFi and ASHA protocols—with reduced functionality compared to Bluetooth LE Audio devices—and you can consult Starkey’s compatibility checklist to see which features you’ll be missing. A word of warning, though: I found an inaccuracy in the above list. According to Starkey’s list, my iPhone 13 Pro Max should have Bluetooth LE Audio and be able to use that protocol to connect to the Edge AI. But my phone does not in fact have Bluetooth LE Audio, and I had to pair with the Edge AI hearing aids using the MFi protocol (more on that below in my review section). Rather than relying on that list, it’s safer to look up the Bluetooth specs for your exact phone model.
Bluetooth LE Audio also brings with it a feature called Auracast—let’s get into what that means.
Edge AI Are Auracast™ Ready
As of now—October 2024—this isn’t a feature that many people will be able to take advantage of; it’s more about ensuring that Edge AI is future-proof. Auracast is a feature of Bluetooth LE Audio that allows audio to be streamed to multiple users’ listening devices—hearing aids or headphones—simultaneously.
Here’s an example of how Auracast will be used in the coming years:
A lot of gyms have rows of televisions above a cardio section. In the future, you’ll be able to use your phone to tune into the gym TV’s Auracast broadcast and stream that audio to your hearing aids—and so can anyone else in the room. Churches, airports, and most public venues will install Auracast transmitters, as this is going to be a broadly used Bluetooth feature for everyone—not just hearing aid wearers.
This technology is ready for use in Edge AI hearing aids, but there aren’t many Auracast-compatible transmitting devices available on the market yet. We expect to see many more Auracast devices in the coming months and throughout 2025.
Apple Watch Support
Included with Edge AI is Apple Watch support as an extension of the My Starkey app. This is a nice new feature, but “extension” is the key word here—you’re not going to get a ton of functionality out of this. With an Apple Watch running watchOS 9 or higher, you’ll be able to adjust the volume and programs and control some of the Edge Mode + settings. I don’t own an Apple Watch so I couldn’t test this, but I imagine some users may find this functionality useful.
Integrated Sensors to Perform a Self-Guided Balance Assessment Exercise
Back in 2018, Starkey was the first company to integrate motion sensors into their hearing aids. These sensors detect movement, orientation, and changes in speed. They are the same types of sensors found in smartphones for screen rotation and in fitness trackers for tracking steps and sleep. Integrating these sensors in 2018 allowed Starkey to add a range of health and wellness features to their hearing aids, including step tracking and fall detection.
This holistic approach raises an important question: why would a hearing aid company build features like step tracking and fall detection? It’s all part of a push to transform their hearing aids into Healthable™ devices, which not only help with hearing but also contribute to overall health and wellbeing.
For instance, a 2012 study suggests a significant association between hearing loss and an increased risk of falls among older adults. With that in mind, Starkey added a fall detection feature to their 2018 Livio AI model, which could send an automatic alert to pre-selected contacts, such as family members or caregivers.
Fall detection and alerts are nice, but once someone’s fallen, it’s already too late
…and that’s what this new feature is all about—preventing the fall in the first place.
Edge AI are the world’s first and only hearing aids that can perform a self-guided balance assessment, helping wearers identify if they’re at an increased risk of falls, and hopefully proactively manage that risk before a fall happens.
We’re waiting for published details on this feature, but Starkey says they’ve partnered with Stanford University and that the assessment follows CDC protocol.
New StarLink Edge TV Streamer, 60% Smaller Than Previous Version
This is a nice update that you hopefully won’t need!
We find that most of our customers simply no longer have trouble hearing the TV once they’re fit with new hearing aids, so a streamer like this usually isn’t necessary, but there are still a few potential use cases:
- If you have severe-to-profound hearing loss, you’ll probably prefer bypassing your hearing aid microphones and streaming the TV directly to your Edge AI hearing aids. It’s going to be much easier to hear the TV clearly.
- If you want to watch TV while someone else in the room doesn’t want to hear it—maybe at night—you can stream directly to your hearing aids and they won’t hear it.
This TV streamer is Auracast-compatible, so if anyone else in your household uses Edge AI hearing aids—or any other brand’s Auracast-compatible hearing aids, for that matter—they can also stream from this device. Or, flipping scenario 2 above: if someone else in your household wants to stream the TV at night to Auracast-compatible headphones while you sleep, they can do that with this streamer as well. I ordered this device to test it out and have some comments on it below.
My Starkey Edge AI Review
For this review I ordered a pair of Edge AI 24 (that’s the best you can get) in the Chestnut color, because it would match my hair the best. You can get Edge AI in 6 different styles or form factors. I chose the mRIC R model because, in Starkey’s previous generation model, Genesis AI, the mRIC R was the most popular with our customers, so I figured that’d be what most people would want to read about. I also just had to see and play with a pair of these mRIC R units, because they truly occupy a unique place in the industry when you consider all the tech jam-packed into their small size.
I also ordered the new StarLink Edge TV streamer, because I’d never used an Auracast device and wanted to see how that all worked. I’ll write about that a bit more below.
Fit & Feel of Edge AI
Dimensions (LxWxD): 26.6mm x 7.4mm x 12.8mm
Weight: 2.30g
By volume, Edge AI is one of the smallest prescription hearing aids on the market, and by weight, it is the smallest prescription hearing aid (at least among aids that are rechargeable, Bluetooth-enabled, and equipped with a deep neural network—all in-demand features).
Honestly, it’s a dream. Having worn every hearing aid on the market, I can tell you, hands down—and there is no close second—this is the most comfortable, feature-rich, and low-profile hearing aid available.
No one noticed them
For the full week prior to wearing these hearing aids, I wore the newest and most comparable aid to Edge AI, the Phonak Audéo Sphere Infinio. The Phonak aids are more than twice the size of these Edge AI hearing aids, and within minutes my kids had noticed I was wearing the Phonak devices. For days on end, no one noticed the Edge AI hearing aids in my ears—not even my wife.
I wanted to put these aids through their paces, so I went to the noisiest restaurant in town that I could think of—I’ll talk about performance in noise a bit below. I talked to the person seated right next to me at the bar for 30 minutes, and it was only at the end of the conversation that I pointed them out on my ears; he’d had no clue I was wearing them.
Rather than trying to hide them or feeling self-conscious about wearing them, I think you’re going to have new-found confidence from hearing speech well in noisy places, and you may just want to point them out and show off all the things they do. And if cosmetics are important to you, I think your search for the right hearing aid should probably start and end with the Edge AI mRIC R.
How I Programmed my Edge AI
I have to tip my hat to Starkey on this one. I hadn’t programmed a Starkey hearing aid in years, and they’ve totally revamped the Pro Fit fitting software that controls the hearing aids. The hearing aids connected to the fitting software much faster than competitor devices do, and the programming was a breeze—a much shorter learning curve than with competitor software.
How does that benefit you? It means you can be more confident that your provider is going to set these hearing aids up correctly for you, even if they have little experience programming Edge AI. It also means your provider is going to be able to spend more of their time counseling you on using the hearing aids—which is where their time really should be spent.
I like to keep it simple
I program every hearing aid I try the same way—I keep it in ‘Automatic’ mode as much as possible. That means that for 99% of the time I’m using the hearing aid, I’m letting it do its thing. In other words, the hearing aid is determining—on its own, from within the automatic/Personal program—which settings to activate.
If you’re not familiar with hearing aid programs, you can think of them as specific sound settings that activate in a hearing aid, depending on your environment. So when you’re in a restaurant, you could manually activate the ‘restaurant’ program and that would optimize the way the hearing aid sounds for your environment. Edge AI 24 hearing aids have 10 programs + Edge Mode (which I’ll discuss in a bit).
With Edge AI—and with all hearing aids—you can manually activate or “switch into” a program if you’d like, but I really prefer not to do that. These hearing aids use machine learning and sophisticated algorithms to determine your environment and adjust on their own, and they do a good job at doing so. My philosophy is if I’m paying for high end tech, I’m gonna let it do its thing. And frankly, I want minimal engagement with my hearing aids anyway. Your hearing provider will help you determine which programs to use, and whether you use more of an automatic approach like I do, a manual approach, or a combination of both.
Your provider will also help you set up various user controls. You can control Edge AI hearing aids via the My Starkey app, the button(s) on the hearing aids, and via the StarLink Remote Control 2.0 (shown further down this page). I’ve actually never programmed a hearing aid with as many user control options as Edge AI has.
Here is a list of things you can potentially control, depending on how your provider configures your hearing aids—and depending on what you want:
- Volume
- Program
- Edge Mode +
- Mute
- Sleep mode
- Offline mode
- Tinnitus therapy levels
- Smart assistant
- Streaming audio start/stop
- Fall alerts
Here’s the thing about the Edge AI mRIC R model that I tried—it’s the smallest, and in my opinion the best-looking aid by a long shot, but that does mean it only has one button on it. That means I’m going to need that single button to control several different settings on the hearing aids. A short press should control one setting, and a long press should control another. And they do—the short and long presses work perfectly. But that only gets me control over 2 of the 10 options above.
That’s where the Tap Control feature comes in—sort of. You know how you can tap AirPods and Galaxy Buds to start/stop audio streaming? Edge AI hearing aids have a similar feature, and in theory, you should be able to use that control to adjust another setting from the list above. So when I first opened up the user controls in the programming software, I thought that between a short press, a long press and hold, and tap control, I’d be able to control 3 of the above settings, and maybe use the My Starkey app to control a couple of others.
Unfortunately, the tap control feature didn’t work as reliably on Edge AI as it has on other hearing aids I’ve tried. It worked about 20% of the time, which was so infrequent that I opted to just disable tap control entirely.
So with tap control out of the picture, I decided to use the short press to activate Edge Mode + and the long press and hold to change the volume. I did that because my goal when wearing these hearing aids—as with all hearing aids I wear—is to keep all the main functionality I’m going to use on the hearing aid itself, so I can use the app as little as possible.
I wish there was a way to turn them off
As far as user controls go, the only feature I really missed was a way to turn them off. The only way to turn this model off is to put it in the charger. But what if you’re on the go and want to put them in your pocket? If they’re still on, they could feed back (whistle) a little bit, and they’ll keep draining the batteries. You’ll see from the list above that there are “sleep mode” and “mute” options which would’ve fixed that problem, but I would’ve then had to give up control over the other features—volume or Edge Mode +.
A way to turn them off would’ve also helped a lot when pairing to my phone and to the TV streamer. Part of troubleshooting Bluetooth pairing includes turning the aids off and on again, which in this case meant walking across the room, putting them in the charger for a few seconds, and then pulling them back out.
I love this minimalist style of hearing aid so much, I just wish I had one more way to control the hearing aid on the aid itself. In 2013, Starkey introduced a model called "Xino" that featured a touch control panel. It would be fantastic to bring back that concept in some form, allowing users to order the Edge AI mRIC R model with multiple control options.
In short—if you’re going to get this small Edge AI mRIC R model, there’s probably no way to escape the fact that from time to time you’re going to have to use the app to control the hearing aids. If you’re fine using the app, none of the above will really matter to you, because the app gives you a ton of control over the hearing aids. It’s just my personal preference to not have to use an app.
My Starkey App
I thought the app was great. It’s not well reviewed in the app stores, with a consistent complaint being connectivity issues, but I never experienced that. The My Starkey app—like all hearing aid apps—can be overwhelming. I know for sure that I didn’t utilize the app to its fullest extent, and I have no interest in doing so.
I only ever used the app when I needed it to do something that I couldn’t do from the hearing aid itself, including:
- Edge Mode + settings (more on that below)
- Bluetooth streaming
- Self check
The Self Check feature is great
To my knowledge, Starkey is the only company that has this feature, and it unfortunately came in handy for me on day 2 of my trial. To use it, you set the hearing aids down on a table and, via the app, run a diagnostic. The diagnostic plays some sounds through the hearing aids, and then gives you a status update on the 3 main components: receiver, microphone, and circuit.
On day 2 of this trial, the left hearing aid suddenly just stopped working. It wasn’t amplifying at all, and it wouldn’t stream audio. At first I was bummed and thought I was going to have to send the left aid back to Starkey before I could resume the trial, but then I remembered the Self Check feature.
So I ran it, got the results (above), and then decided to simply pull the receiver (speaker) out of the unit and put it back in again. Voilà! I’m guessing that when I was repeatedly pulling the hearing aids in and out of the charger during the TV streamer pairing process (more on that below), I made the mistake of grabbing the hearing aid by the receiver wire and loosened the connection a bit.
You can really get into the weeds on the app—I barely scratched the surface in terms of things you can do with it.
Here’s a list of things you can do and control via the My Starkey app—
- Volume control
- Program control
- Edge mode settings
- Custom program creation along with geo-tagging (the program will activate when you’re at a specific location you’ve tagged).
- Find my hearing aids feature
- Battery status indicator
- Audio streaming sound quality settings
- Self check
- Noise reduction control
- Wind reduction control
- Equalizer control (bass/middle/treble)
- Health tracking (steps, exercise time, etc)
- Fall alerts
- Hearing aid usage tracking
- Balance assessment
- Knowledge library
- Reminders
- Language translation
- Transcribe conversations (text appears on the app as you speak)
- Telehear (remote support from your hearing care provider)
- Smart Assistant (I’ll call it “semi-smart”—more on this below)
Warning—some of the above features are not ready for prime time (I’m looking at you, ‘Smart Assistant’)—more on this below. If any of the above features are the sole reason you’re buying Edge AI, it would be worthwhile to have a candid conversation with your hearing professional about how they actually work. The features that are critical for the majority of people—like adjusting how the hearing aids sound—work perfectly. Some of the other features are in their early days and will be refined over time.
Voice-Activated, Generative AI-Powered ‘Smart Assistant’
This feature hasn’t gotten as much airtime as the other headlines in this release, and it’s probably because the feature is in its early days. It’s technically called the Smart Assist feature, and it’s been in Starkey hearing aids since 2023. Here’s the official description of this feature from Starkey: ‘Patients can use their voice to adjust settings, set reminders, ask questions about hearing aid topics, and more.’
What’s new is that with the Edge AI model, this feature is powered by Generative AI. When I first heard that, and considered my own experiences with generative AI-powered voice chatbots, I was salivating. If you’re out of the loop, here’s an example of this kind of tech in action.
In reality, the feature doesn’t work anywhere near as robustly as you might expect when you read that it’s powered by Generative AI. It did a fine job of following my basic voice commands, like raising the volume or changing programs. But when I used this feature, it only seemed to work when the My Starkey app was already open on my phone, and at that point, I found it quicker and easier to just use the app itself to change the volume or program.
For more complicated queries like checking the weather or other real time information, for now I would definitely prefer just using Siri, which uses a more natural voice and is faster and smarter. So in my opinion, the Smart Assist feature isn’t ready for primetime, but with a little imagination it’s not hard to see where this is headed—and it’s wild, in a good way…I think.
Overall Sound Quality
I’ll preface this by saying, my hearing loss is very mild. It’s normal in the low frequencies, and only drops a bit in the high frequencies—enough for me to struggle with hearing in noisy places like restaurants.
Having said that, and considering that I have fairly normal hearing across the board, these were a bit “much” for me. I heard everything. I understand that’s the whole point, and for many people with an actual hearing loss that impacts their day-to-day life, that is a wonderful quality in a hearing aid. But for my mostly normal ears, it was overwhelming—I felt like I could hear a pin drop upstairs.
Some things just didn’t sound natural
…And they don’t on any hearing aids—at least to me. Doorbells, water running, toilets flushing, electronic chimes and dings. And here’s a strange one I noticed with Edge AI—speech from a car’s navigation system sounded like harmonized voices (as if 3 or more people were saying the same thing in different tones). None of those issues are deal breakers for me, because these hearing aids are optimized to help you hear speech—and they do that really well. More on that in a minute.
Performance in Noise with the Edge Mode + Feature Was Outstanding
Before I get into my experience using the Edge Mode + feature and how well this hearing aid works in noise, here’s a brief primer on all this “Edge Mode” and deep neural network stuff.
What’s the big deal about deep neural networks in hearing aids anyway?
Starkey was the first to use a deep neural network in a hearing aid back in 2020. Since then, their two biggest competitors—Oticon and Phonak—have also included their versions of this tech in their own hearing aids. As of now, deep neural networks are the gold standard for helping hearing aids do their most important task: separating speech from noise.
Traditionally, hearing aid sound processing has been handled by a trio of directional microphones, engineer-written algorithms, and environmental detection via machine learning. However, manufacturers have discovered that deep neural networks can mimic the way the brain processes sound, allowing for more effective separation of speech from background noise compared to traditional methods. This results in a better signal-to-noise ratio (SNR), the key metric used to assess a hearing aid's performance in noisy environments. It’s complicated stuff, and if you want to get into the weeds, here’s a good article.
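If you’re curious what “separating speech from noise” actually looks like under the hood, here is a rough, simplified sketch of mask-based noise reduction, the general family of techniques that DNN-based processing belongs to. To be clear, this is not Starkey’s algorithm: the hand-written gain “mask” below, and the enhance function around it, are just my stand-in for what a trained neural network would predict for every slice of time and frequency.

```python
# Conceptual sketch only: mask-based noise reduction with a hand-written mask
# standing in for a DNN's per-bin predictions. Not Starkey's implementation.
import numpy as np
from scipy.signal import stft, istft

def enhance(noisy, fs, noise_seconds=0.5, nperseg=512):
    """Suppress steady background noise in a mono signal using a per-bin gain mask."""
    # Break the signal into overlapping frames and move to the frequency domain.
    _, _, spec = stft(noisy, fs=fs, nperseg=nperseg)
    mag = np.abs(spec)

    # Estimate the noise floor from the first half-second (assumed speech-free here).
    hop = nperseg // 2
    noise_frames = max(int(noise_seconds * fs / hop), 1)
    noise_floor = mag[:, :noise_frames].mean(axis=1, keepdims=True)

    # Placeholder "mask": pass bins that stand well above the noise floor,
    # attenuate bins near it. In a DNN-based system, a trained network
    # predicts this gain for every time-frequency bin instead.
    snr_est = mag / (noise_floor + 1e-10)
    mask = np.clip(1.0 - 1.0 / snr_est, 0.1, 1.0)

    # Apply the mask and rebuild the time-domain signal.
    _, enhanced = istft(spec * mask, fs=fs, nperseg=nperseg)
    return enhanced

# Example: clean up one second of a synthetic "voice" (a tone) buried in noise.
if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs
    noisy = np.sin(2 * np.pi * 440 * t) * (t > 0.5) + 0.3 * np.random.randn(fs)
    cleaned = enhance(noisy, fs)
```

The reason manufacturers train deep neural networks for this job is that a learned mask handles messy, non-stationary noise (like competing talkers in a restaurant) far better than simple hand-written rules, and that is broadly where the improved SNR numbers come from.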
The bottom line is—surprise surprise—artificial intelligence is better at some of this stuff than humans are, and now all hearing aid manufacturers are in a race to figure out how to best use AI in their hearing aids.
What is Edge Mode +?
In short, Edge Mode + is the DNN-powered feature in Edge AI hearing aids that delivers the strongest speech enhancement and noise reduction. Initially introduced in 2020 as "Edge Mode," users could activate it in noisy environments for one-time AI-based sound optimization. In 2023, the Genesis AI model introduced "Edge Mode +," which continuously adapted to changing environments after activation—not just one time. Now in 2024, Edge AI takes it further—its DNN is now always on and provides continuous environmental analysis even without activating Edge Mode +, making manual activation of Edge Mode + less necessary.
Oticon has taken the same approach with their Intent hearing aid—the DNN runs constantly in the background as part of the core sound processing. With the Phonak Sphere that I wore last week, that wasn’t the case—you had to manually activate it. Because the DNN is always running, you won’t have to fiddle with Edge AI hearing aids as much as you would with other hearing aids when you get into noise.
How to activate Edge Mode +
While the DNN is always on with Edge AI hearing aids, you still get some manual control over it—it’s not completely automated. And that’s where Edge Mode + comes in. Edge Mode + is like the DNN sound processing on steroids. It’s similar to Oticon’s Speech Booster feature: it takes the DNN noise processing up a notch and makes it more aggressive.
If you buy the Edge AI 24 model, you can activate Edge Mode + via the button on the hearing aid or the My Starkey app. If you buy the Edge AI 20 or Edge AI 16 model, you can only activate Edge Mode + via the My Starkey app. For me, this would be a good enough reason to buy the 24 model. I prefer minimal interaction with the app, so I programmed these hearing aids to activate Edge Mode + with a short press of the push button. This setup allowed me to engage the feature effortlessly.
Edge Mode + Works Really, Really Well
Starkey hasn’t published their studies yet, and I’ll include a link and analysis here when they do, but they’re saying Edge AI offers up to a 13 dB signal-to-noise ratio improvement (compared to unaided ears). The highest we’d heard of prior to Edge AI was 12 dB (from Phonak).
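To put a number like that in perspective (this is my own back-of-the-envelope math, not a Starkey figure): decibels are logarithmic, so a 13 dB improvement in SNR corresponds to roughly a 20-fold improvement in the ratio of speech power to noise power:

$$\mathrm{SNR_{dB}} = 10\log_{10}\!\left(\frac{P_{\text{speech}}}{P_{\text{noise}}}\right), \qquad 10^{13/10} \approx 20$$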
I wore Edge AI in the noisiest restaurant that I know of, where, without hearing aids, I absolutely would have had to guess at a few things the bartender was trying to tell me.
While wearing these, I never had to ask the bartender/server to repeat themselves. The way the Edge AI separates speech from noise is hard to describe—you really have to experience it to understand. The best way I can explain it is that speech feels like it exists on its own separate 'layer' or 'plane,' distinct from all the other background noise.
I wore these hearing aids in their most aggressive speech enhancement mode possible. I am pretty sure that neither Starkey nor any hearing care professional would recommend using them the way I did. My preferred settings for Edge AI in noise might be different from yours, but here’s what I liked:
First, I’d put the hearing aids in Edge Mode +. Once Edge Mode + is activated, a screen gives you the option to instruct the hearing aid to go a step further and either ‘Enhance speech’ or ‘Reduce noise’, and that becomes the main priority of the hearing aid. After clicking Enhance speech, I would then, from the same screen, manually adjust the sound and increase the treble as high as it would go. This makes the consonants of speech clearer—the parts of speech that are easiest to miss in a conversation.
The results were simply mind-blowing. Did speech sound natural? No. Not even close. Did I care? No, not at all. I don’t know about you, but priority number one in a noisy environment is just hearing speech clearly. If I can do that, I couldn’t care less about the actual quality of the sound. It is a drastically noticeable difference and it just feels easier to hear—it requires less effort.
So that was how I got the most out of these hearing aids in noise. But as I mentioned a minute ago, the DNN is always on in these hearing aids, even in Personal mode, so even if you don’t want to fuss with activating Edge Mode + and making the adjustments I did, you will still notice an improvement in speech clarity in noisy places. I certainly did.
Why not just keep Edge Mode + on all the time?
Unfortunately, you do have to manually activate Edge Mode + anytime you want to use it, but if you’re fine with doing that, you could technically keep it on all the time. It’s not going to drain the battery any faster—but I personally wouldn’t want to leave it on. There’s a reason Starkey (and other manufacturers) make you manually activate features like this: when you do activate it, the hearing aid is making some compromises.
I noticed with Edge AI that when I used this feature with its most aggressive settings, sound quality definitely took a hit—and that makes sense. When you’re using Edge Mode +, the hearing aid prioritizes speech clarity above all else, and that can mean a less well-rounded or “rich” quality of sound. To me, the voices sounded sort of mechanical—but I think that’s just what it takes to get the voices to clearly stand out from the background noise, because I’ve noticed the same thing with other hearing aids.
I think the way I’d use Edge AI hearing aids in the long term is to leave them in ‘Personal’ mode almost all the time, letting Starkey’s automatic signal processing be in charge. On the rare occasions when I’d need extra help, that’s when I’d activate Edge Mode +.
Using Bluetooth with Edge AI Hearing Aids
If you’ve read any of my other reviews you may know this—I’m not a huge fan of having Bluetooth-connected hearing aids and probably would never use the feature if I wore hearing aids full time. If I were buying hearing aids, I’d be buying them to do one job—help me hear better.
Having said that, the Bluetooth on these was a disappointment, but that’s partly my fault—I didn’t do enough research. I had read in the headlines that Edge AI used Bluetooth LE Audio, and I’d read on Starkey’s compatibility list that my iPhone had Bluetooth LE Audio, so I thought I was finally going to get to test out the new Bluetooth protocol with these hearing aids.
I should’ve independently confirmed which Bluetooth version my iPhone 13 Pro Max was using, because it is definitely not Bluetooth LE Audio like the list above suggests. Turns out, very few smartphones actually use Bluetooth LE Audio right now—there are a few Google Pixel devices and a few Samsung phones.
That meant that for pairing my iPhone up to the Edge AI hearing aids, I had to use the MFi specification—which I’m not a fan of. Pairing was easy enough, but the range was poor (a few feet), the connection was unstable (audio would often cut out during streaming), and the streaming sound quality for music, TV, and hands-free calling was not good.
I’m glad Edge AI has Bluetooth LE Audio. We need to get to a place where all hearing aids use the same protocol—so users with different phones don’t end up with different feature sets and different levels of performance. I wish I—and our customers—could immediately benefit from Edge AI having Bluetooth LE Audio, but this is more about future-proofing the hearing aids. As more and more devices are released with Bluetooth LE Audio, this frustrating smorgasbord of Bluetooth protocols will eventually come to an end and everyone will be better off.
I did really like that I could turn off notifications from my phone
One of my gripes about the Phonak Sphere hearing aids I wore last week was that every notification that came to my phone could be heard through my hearing aids. The only way I could disable those notifications was by breaking the Bluetooth connection between my phone and hearing aids, which I didn’t want to do. For instance, I’d be falling asleep on the couch and would be awoken by an email alert or something through the hearing aids. Quite annoying.
With the Edge AI hearing aids—at least when used with an iPhone—you can use the native settings inside the phone to turn off notifications so they don’t go to your hearing aids. You can do this by going to Settings -> Accessibility -> Hearing Devices -> Play System Sounds.
Battery Life
The battery life of Edge AI is astounding. On the larger RIC RT model, Starkey advertises 51 hours of battery life. On this smaller mRIC R model, it’s up to 41 hours, even with a few hours a day of Bluetooth streaming. Competing hearing aids with similar feature sets don’t even come close to that kind of performance. What’s really impressive is that the deep neural network tech doesn’t drain the batteries—here’s what I mean.
A primary competitor to this hearing aid is the Phonak Audéo Sphere Infinio, and when that hearing aid has its DNN features engaged, the battery drain triples and battery life can be as short as 7 hours. The Edge AI aids seem completely unaffected by the DNN in terms of battery drain. After a full charge in the morning, the lowest I ever saw my battery get by the end of the day was 65%. And then one night I forgot to put the aids in the charger—so they stayed on all night—and when I woke up they were still at 40%, after a full 24 hours of being on. I started tinkering in the app after this happened and found a feature called “Auto Sleep.” If you activate it in the app, anytime you lay the hearing aids on a table for 10-15 minutes they go into sleep mode, which drains the battery significantly less. You should activate this feature.
StarLink Edge TV Streamer Review
I was really looking forward to trying this because I wanted to test the sound quality of Bluetooth LE Audio, and I’d never used an Auracast device before. But as I mentioned previously, I came to find out—too late—that my iPhone did not have Bluetooth LE Audio, so I couldn’t use the Auracast feature (Auracast requires Bluetooth LE Audio to work).
At this point in time, I’m not sure I can recommend this accessory. I read the instructions closely and called Starkey technical support, and it still took me an hour to get this connected to my TV properly. Once I had finally set it up correctly, it worked very well for a few hours and I was relatively pleased with the ease of use.
Twice, within my two days of using the TV streamer, the TV streamer lost the pairing (Bluetooth connection with my phone)—meaning I couldn’t stream, and the only way to fix it was to:
- 1. Turn off the Bluetooth on my phone.
- 2. Take out the hearing aids and put them in the charger for a few minutes, and then pull them out (that’s how you turn them on and off).
- 3. Put the hearing aids next to the TV streamer and try to re-pair them (worked 50% of the time).
- 4. Turn the Bluetooth on my phone back on, and everything would work again.
That is a super annoying process that I would never want to fuss with. I’m sure the way the TV streamer is supposed to work is that you only have to pair it to your hearing aids one time, and then every time you want to use the streamer, you just start streaming via the My Starkey app. When it did work that way for me, it was great; it just didn’t work that way consistently.
The good news is, one of the promises of Bluetooth LE Audio is better connectivity, and with Auracast broadcasting, I am pretty sure that pairing process would’ve been streamlined.
Something to consider before buying the TV streamer
The TV streamer connects to your TV via your TV’s Digital Audio Out (Optical) port. Most TVs only have one of these ports, and if you use a soundbar or any external speakers with your TV, you’re probably already using it. That means if you plug in your TV streamer, you’ll lose the audio from your soundbar or external speakers. The solution is to buy an optical splitter, which will effectively give your TV two of those ports. And beware: depending on the TV you have, using the TV streamer may cut off audio from the speakers for everyone else in the room, so that only you hear the sound through your hearing aids. If you buy the TV streamer, be prepared for a good amount of technical troubleshooting.
My recommendation would be to not buy this streamer, at least not when you initially buy the Edge AI hearing aids. Wear the hearing aids for a bit and see if you really need this accessory.
Starkey Edge AI Pros & Cons
Pros
Excellent performance in noise. Starkey has been using deep neural networks in their hearing aids longer than anyone, and it shows. Sure, the sound quality of speech degrades a bit when Edge Mode + is really “doing its thing” (working at its most aggressive), but that’s a trade-off many will be happy to make in order to understand speech clearly in the most challenging environments.
It’s the best-looking hearing aid in its class. I thought the noise reduction and speech enhancement of Edge AI was as good as the Phonak Sphere’s (and better in some ways), at less than half of Sphere’s size. When I compare the aesthetics of Edge AI to its top two competitors with similar feature sets, there is just no comparison. Edge AI is lighter, smaller, and more discreet than competitor devices. In addition, Starkey makes Edge AI in 6 different styles, including in-the-ear styles.
Best battery life in the industry, and disposable battery options are available. With up to 51 hours of battery life on the RIC RT model, you’re basically never going to have to worry about running out of charge, even if you only charge them every few days. And if rechargeable batteries bother you, you can always order this in the RIC 312 model, which uses a disposable battery that you’ll only have to change once a week or so.
It’s future-proof with Bluetooth LE Audio and Auracast. It’s surprising that not all manufacturers are adopting this technology. Any hearing aid released in 2024 and beyond should include Bluetooth LE Audio. While most users will not notice an immediate benefit because many phones don’t yet have Bluetooth LE Audio, the rollout is expected to accelerate in 2025. Having this technology in your hearing aids now ensures you won’t be stuck waiting another 4-5 years for your next upgrade to experience the advantages of Bluetooth LE Audio and Auracast.
Minimal circuit noise—Edge AI is a quiet hearing aid. All hearing aids generate a bit of low-level white noise from the electronic components inside. Edge AI had the least amount of this noise of any Starkey aid I’ve worn, and compared to competitor aids I’ve tried, it was also on the quieter side.
Cons
There is no way to automatically activate Edge Mode + at startup. The Edge Mode + feature is simply amazing, and there were many times I preferred it over the ‘Personal’ program. I wish I had the option to automatically activate it when I put the hearing aids on in the morning. For now, you have to manually activate Edge Mode + whenever you want to use it, which, if you’re anything like me, will be often.
With the Edge AI mRIC R model, I wish there was greater flexibility with user controls. Since tap control doesn’t work reliably, there is no way you’re going to be able to control everything you’d like using just the button on the hearing aid. You’re going to have to use the My Starkey app sometimes.
Some features don’t work very well. As I wrote about above, the Tap Control feature didn’t work as well for me as it did on competitor products, and while the Smart Assistant has great promise, it’s in its early days and it shows.
I wish there was an LED indicator light. I’ve found these lights to be helpful on other models. Specifically, they let me know that the hearing aid is turning on or off when I want it to. Since there’s no indicator light on this hearing aid, the only way to know it’s successfully booted up is to put it on and see if you can hear.
A Battle of Chip Philosophies: Single vs Dual-Chip Architectures
As of October 2024, there are 3 hearing aids—Starkey Edge AI, Oticon Intent, and Phonak Sphere—that use deep neural networks to do their most important task: enhance speech clarity in noise.
Interestingly, Starkey and Oticon (the first to adopt DNNs) are convinced that embedding the DNN on the hearing aid’s main processing chip is the right move. They claim it leads to better power efficiency and lower latency. Phonak, on the other hand, argues that their dual-chip architecture—with the DNN on its own chip—allows for more dedicated resources for the DNN’s functions, potentially leading to more advanced capabilities.
It’s not clear who the winner will be, but one approach is likely to prove superior in the long term. With computing power increasing, chips getting smaller, and consumer demand growing for streamlined devices, my money is on the single-chip architecture embraced by Starkey and Oticon.
With Edge AI, Starkey, in particular, has demonstrated a clear understanding of what consumers want—they're not just meeting today’s demands, but also anticipating future needs by rolling out features that users may not even realize they want yet.
As former Apple CEO John Sculley once said, "The future belongs to those who see possibilities before they become obvious."
Starkey is doing exactly that, paving the way for a bright future for both the company and its customers.