Apple's acquisitions are usually shrouded in secrecy - it often takes a few years before you can see what the company intended to do with its purchase.

When Apple bought Shazam at the end of 2017, it was more or less clear what Cupertino's developers would do with the purchase: The music recognition service would benefit Apple Music. Indeed, this is what happened: Shazam is now firmly integrated into Apple’s system and the iPhone now recognises music from within the Control Center - you don’t even need to use the app.

Unlike other apps that Apple has purchased, Apple has not discontinued the Shazam app. In fact, even the Android version is still available via the Google Play Store.

Apple has switched off advertising in the iOS app, and the Android version is also ad-free.

Immediately after the takeover, we speculated that Apple would benefit from the purchase mainly on the Android front, since the app also has a fairly large installed base on Android smartphones.

This was indirectly confirmed when Apple presented a new Shazam playlist this spring: "Shazam Predictions 2021". The playlist is based on the millions of requests from users around the world who want to identify a song. Apple’s Shazam playlist features 50 emerging artists who, based on the Shazam data, are poised to have a breakthrough year. Apple can now make money by presenting these songs that, according to the data, are likely to be hits.

This is where the Android faction comes into play: If Apple had discontinued the Android app almost three years ago, its music analysis department would have lost the data from several hundred million users who had installed the app on their Android devices.

More than music recognition

But Apple Music and music recognition on the iPhone are nothing compared to the announcement that Apple made at WWDC: Shazam’s sound recognition technology will be made available to third-party apps. From iOS 15 onwards, apps will be able to match against their own libraries of sounds, words and more using Shazam technology.
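As a rough sketch of what this might look like for developers - the file name and metadata here are illustrative assumptions, not Apple's sample code - an app could turn a reference recording into a Shazam signature and store it in a custom catalog:

```swift
import AVFAudio
import ShazamKit

// Sketch: build a ShazamKit custom catalog from a reference recording.
// "lesson.wav" and the media-item metadata are illustrative assumptions.
func buildCatalog() throws -> SHCustomCatalog {
    // Read the reference recording into a PCM buffer.
    let url = Bundle.main.url(forResource: "lesson", withExtension: "wav")!
    let file = try AVAudioFile(forReading: url)
    let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                  frameCapacity: AVAudioFrameCount(file.length))!
    try file.read(into: buffer)

    // Condense the audio into a compact Shazam signature.
    let generator = SHSignatureGenerator()
    try generator.append(buffer, at: nil)
    let signature = generator.signature()

    // Attach metadata and add the signature to a custom catalog,
    // which a session can later match live audio against.
    let item = SHMediaItem(properties: [.title: "Lesson 1"])
    let catalog = SHCustomCatalog()
    try catalog.addReferenceSignature(signature, representing: [item])
    return catalog
}
```

The key design point is that matching then runs against the app's own catalog rather than Shazam's music database.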

The video below shows what could be possible with this technology. Immediately after the announcement, a developer made a word counter for spoken language. For example, you can determine how many times Tim Cook used the word "amazing" in the keynote.

This could be a fun tool to use, but it merely shows the tip of the iceberg. ShazamKit could be capable of so much more in third-party apps. During the presentation Apple demonstrated a learning app that turns a linear video recording into an interactive experience. In this example the app recognises when the student asks the teacher a certain question and displays question-and-answer modules to help the student understand the material.
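A hedged sketch of how such a learning app might react to a match - assuming a pre-built `SHCustomCatalog` and a hypothetical `showQA(at:)` helper that are not from Apple's demo:

```swift
import AVFAudio
import ShazamKit

// Sketch: match microphone audio against a custom catalog and use the
// match offset to pick the right question-and-answer module.
// The catalog contents and showQA(at:) are hypothetical.
final class LessonMatcher: NSObject, SHSessionDelegate {
    private let session: SHSession

    init(catalog: SHCustomCatalog) {
        session = SHSession(catalog: catalog)
        super.init()
        session.delegate = self
    }

    // Feed this from an AVAudioEngine input-node tap for each buffer.
    func process(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime?) {
        session.matchStreamingBuffer(buffer, at: time)
    }

    func session(_ session: SHSession, didFind match: SHMatch) {
        guard let item = match.mediaItems.first else { return }
        // Offset into the reference recording where the match occurred,
        // which tells the app where in the lesson the student is.
        showQA(at: item.predictedCurrentMatchOffset)
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature,
                 error: Error?) {
        // Not every buffer has to match; keep listening.
    }

    private func showQA(at offset: TimeInterval) {
        print("Show Q&A module for lesson offset \(offset)s")
    }
}
```

The streaming delegate pattern means the app never leaves the microphone loop; a match simply surfaces the interactive module at the right point in the lesson.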

ShazamKit can be used to give audio content an additional interactive layer. For example, the Apple Watch could display an alert when someone rings the doorbell, and the same could be applied to other devices and noises with ShazamKit.

In short: Apple bought Shazam as a quite useful feature for its streaming service Apple Music, but has continued to invest in it and develop it as a resource of its own. Thus a gimmick can become the basis for great apps.

This article originally appeared on Macwelt. Translation by Karen Haslam.