Apple Music’s matching system goes by metadata only, drops acoustic fingerprinting

Apple Music is the company’s newest endeavor, and, like many endeavors before it, it enters an already crowded space full of established competitors.
As such, and simply because it’s an Apple product, there’s still a spotlight on it well after its launch alongside iOS 8.4 on June 30. Many folks are still tearing Apple Music apart at the seams to figure out how it works and how it compares to the other services available for roughly the same price each month for unlimited streaming. Of course, one of Apple Music’s selling points is access to the library you already own: having all of your music in one, easily accessible place.
Unfortunately, some folks aren’t having good luck with the matching service inside Apple Music, which attempts to find tracks in the streaming catalog that correspond to what the listener already owns. For those who use, or have used, iTunes Match, there’s a big difference between Apple Music and iTunes Match, as Kirk McElhearn points out, right down to the way the two services match music. iTunes Match uses “acoustic fingerprinting,” which means that, whatever tags you might have attached to a track or album, iTunes Match connects the original track with the same music in the catalog, not just with matching tags.
As McElhearn suggests, this means that even if you tag a Grateful Dead track as 50 Cent, iTunes Match will still identify the song by the music itself and match it correctly.
However, Apple Music apparently drops this feature and relies on metadata by default, and only on metadata. McElhearn ran a quick test: he took a track from a collection of Bach chorales and tagged it as “Can’t Feel My Face” by The Weeknd. Once Apple Music matched the file, he deleted the local copy and downloaded the track again from the cloud. The original piece of Bach music was 1:57 long, but the re-tagged song came back from the cloud as a 3:36 track, and when he played it, it wasn’t a Bach chorale at all but The Weeknd’s “Can’t Feel My Face.”
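To make the distinction concrete, here’s a minimal, hypothetical sketch in Python. The catalog, the fingerprint() helper (a plain hash standing in for real acoustic analysis), and the track data are all invented for illustration and have nothing to do with Apple’s actual systems; the sketch only shows why tag-only matching returns the wrong recording while content-based matching does not.

```python
# Hypothetical illustration only: not Apple's implementation.
import hashlib

def fingerprint(audio_bytes: bytes) -> str:
    """Stand-in for acoustic fingerprinting: identify a track by its audio content."""
    return hashlib.sha256(audio_bytes).hexdigest()

# A made-up cloud catalog, indexed two different ways.
CATALOG = [
    {"title": "Can't Feel My Face", "artist": "The Weeknd", "audio": b"weeknd-studio-audio"},
    {"title": "Chorale",            "artist": "J.S. Bach",  "audio": b"bach-chorale-audio"},
]
BY_METADATA    = {(t["title"], t["artist"]): t for t in CATALOG}
BY_FINGERPRINT = {fingerprint(t["audio"]): t for t in CATALOG}

# A local Bach file whose tags have been edited to claim it's The Weeknd.
local_file = {"title": "Can't Feel My Face", "artist": "The Weeknd",
              "audio": b"bach-chorale-audio"}

# Metadata-only matching trusts the tags and picks the wrong song.
wrong = BY_METADATA[(local_file["title"], local_file["artist"])]
print(wrong["artist"])   # -> The Weeknd

# Fingerprint matching ignores the tags and identifies the actual audio.
right = BY_FINGERPRINT[fingerprint(local_file["audio"])]
print(right["artist"])   # -> J.S. Bach
```

In the first lookup, the mislabeled file “matches” The Weeknd track, so re-downloading it from the cloud would hand back the wrong recording, which is essentially what McElhearn observed.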
This can lead to other issues, too. McElhearn notes that Apple Music can’t tell the difference between live recordings and studio recordings either, so, as a tweet from Susie Ochs referenced in the article shows, Apple Music matched the live tracks in her library and replaced them with the studio versions of each song.
As McElhearn points out, the issue is all the more puzzling because Apple already uses a better method, acoustic fingerprinting, for iTunes Match, so falling back on this scattershot option makes little sense, especially when it’s causing such headaches for users.
Are you having any issues with Apple Music?