
Design and Development of a Siri‑Like Voice‑Controlled Music iOS App

This article walks through the design and implementation of a voice‑controlled music iOS application inspired by the Siri SDK introduced in iOS 10, using Sketch and Principle for UI prototyping and Xcode with Objective‑C and the Nuance SpeechKit SDK for speech recognition, culminating in a functional prototype that searches iTunes and plays song previews.

Ctrip Technology

The session introduces a Siri‑like voice‑controlled music app, explaining how the newly released Siri SDK in iOS 10 enables developers to create voice‑interactive experiences, and briefly reviews the evolution of AI assistants.

Design part: Using Sketch and Principle, the team creates UI prototypes for the app, illustrating the microphone interaction, animation flow, and page transitions. Sketch provides lightweight UI design, while Principle adds animated interactions such as a rotating microphone to indicate listening.

Images of the Sketch mockups and Principle animation sequences are shown to demonstrate the prototype workflow.

Development part: The prototype is implemented in Xcode with Objective‑C, and speech recognition is added via the Nuance SpeechKit 2 iOS SDK. The project uses CocoaPods to manage dependencies; the Podfile includes the line `pod 'SpeechKit'`. After running `pod install`, the generated workspace is opened and the SDK is imported with `#import <SpeechKit/SpeechKit.h>`.
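A minimal Podfile for this setup might look like the following. The target name `VoiceMusicApp` and the platform version are assumptions; adjust them to match the actual Xcode project.

```ruby
# Podfile — target name and iOS version are placeholders for this sketch
platform :ios, '9.0'

target 'VoiceMusicApp' do
  pod 'SpeechKit'
end
```

Running `pod install` in the project directory then generates the `.xcworkspace` that should be opened instead of the `.xcodeproj`.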

Developers must register on Nuance to obtain a server URL and app key, then configure the SDK to start a speech recognition session and handle the delegate callback to retrieve the best‑matched text.
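The flow above can be sketched in Objective‑C roughly as follows, based on the published SpeechKit 2 API. The server URL and app token are placeholders for the credentials issued on the Nuance developer portal, the class name `VoiceViewController` is hypothetical, and `searchITunesForTerm:` stands in for whatever method the app uses to act on the recognized text.

```objectivec
#import <SpeechKit/SpeechKit.h>

@interface VoiceViewController () <SKTransactionDelegate>
@property (nonatomic, strong) SKSession *session;
@end

@implementation VoiceViewController

- (void)startListening {
    // URL and app token come from your Nuance developer account (placeholders here)
    self.session = [[SKSession alloc] initWithURL:[NSURL URLWithString:@"nmsps://YOUR_APP_ID@sslsandbox.nmdp.nuancemobility.net:443"]
                                         appToken:@"YOUR_APP_KEY"];
    // Start a dictation-style recognition transaction with this object as delegate
    [self.session recognizeWithType:SKTransactionSpeechTypeDictation
                          detection:SKTransactionEndOfSpeechDetectionShort
                           language:@"cmn-CHN"
                           delegate:self];
}

// Delegate callback delivering the recognition result; the first hypothesis
// is the best-matched text
- (void)transaction:(SKTransaction *)transaction
    didReceiveRecognition:(SKRecognition *)recognition {
    NSString *bestMatch = recognition.text;
    [self searchITunesForTerm:bestMatch]; // hypothetical helper that queries iTunes
}

@end
```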

Once the spoken query is recognized, the app queries the iTunes Search API (e.g., https://itunes.apple.com/search?term=牛仔很忙&limit=1) to fetch album artwork and a preview URL, which are displayed on the third prototype page.
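A sketch of that request with `NSURLSession` might look like this. The response keys `artworkUrl100` and `previewUrl` are part of the documented iTunes Search API; the method name is hypothetical.

```objectivec
// Query the iTunes Search API for the recognized text and extract
// the album artwork and 30-second preview URLs from the first result.
- (void)searchITunesForTerm:(NSString *)term {
    NSString *encoded = [term stringByAddingPercentEncodingWithAllowedCharacters:
                         [NSCharacterSet URLQueryAllowedCharacterSet]];
    NSURL *url = [NSURL URLWithString:
                  [NSString stringWithFormat:@"https://itunes.apple.com/search?term=%@&limit=1", encoded]];
    [[[NSURLSession sharedSession] dataTaskWithURL:url
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (error || !data) { return; }
        NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:nil];
        NSDictionary *track = [json[@"results"] firstObject];
        NSString *artworkURL = track[@"artworkUrl100"]; // album cover image
        NSString *previewURL = track[@"previewUrl"];    // song preview audio
        // Dispatch to the main queue to update the UI and start playback.
    }] resume];
}
```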

Basic UI animations, such as microphone rotation, are implemented with CABasicAnimation . The final result is a fully functional voice‑controlled music app that demonstrates the integration of design tools, AI speech recognition, and iOS development techniques.
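The rotating-microphone effect can be sketched with a repeating rotation around the layer's z-axis; `micImageView` is a hypothetical `UIImageView` holding the microphone icon.

```objectivec
#import <QuartzCore/QuartzCore.h>

// Spin the microphone icon continuously to signal that the app is listening.
CABasicAnimation *spin = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
spin.fromValue   = @0;
spin.toValue     = @(2 * M_PI);   // one full turn per cycle
spin.duration    = 1.0;           // seconds per revolution
spin.repeatCount = HUGE_VALF;     // repeat until explicitly removed
[micImageView.layer addAnimation:spin forKey:@"micSpin"];

// When recognition ends:
// [micImageView.layer removeAnimationForKey:@"micSpin"];
```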

For more details and a video walkthrough, viewers are invited to watch the original Ctrip Tech micro‑share recording.

Tags: mobile development, iOS, Xcode, Objective‑C, voice control, speech recognition, Siri SDK
Written by Ctrip Technology

Official Ctrip Technology account, sharing and discussing growth.