Watson delivers a user experience that excites the Grammy Awards: IBM co-develops “GRAMMY Insights with IBM Watson” to display information in real time over live video

IBM and the National Academy of Recording Arts and Sciences, the organization behind the Grammy Awards, have announced that they co-developed a system that uses IBM Watson to extract insights from vast amounts of artist information and deliver a compelling user experience.

It will be used during the live broadcast of the 64th Grammy Awards, which starts at 5:00 p.m. Eastern Time on April 3 (7:00 a.m. on April 4, Japan time), and on the artist pages of the Grammy Awards website (Grammy.com), where it has been available since March 18.

The new “GRAMMY Insights with IBM Watson” (hereafter “GRAMMY Insights”) collects millions of articles and pieces of information from thousands of data sources, scores them with the natural language processing of IBM Watson Discovery, has the results rated by humans, and displays preview overlays in real time on the live video in which the artists appear.

“GRAMMY Insights, an intelligent workflow, not only delivers a compelling user experience for music fans everywhere, but also demonstrates how AI can change the way users experience and consume content,” IBM says, emphasizing its significance.

Overlay Previews

IBM has explained how GRAMMY Insights works, so let us walk through it.

First, there were two development themes: one is to improve the digital experience to increase user engagement, and the other is to implement features that let the National Academy of Recording Arts and Sciences team concentrate on its specialized work.

IBM designed and developed the architecture shown in the figure below to achieve this. It is built on Red Hat OpenShift so that users and the Academy team can use and scale it flexibly, and it adopts IBM Cloud Code Engine for serverless computing.
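As a rough illustration of that serverless style, here is a minimal sketch of a stateless Node.js/TypeScript HTTP service of the kind that can be packaged as a container image and scaled on Red Hat OpenShift or IBM Cloud Code Engine. The port handling and the health endpoint are assumptions for illustration, not taken from IBM's implementation.

```typescript
// Minimal stateless HTTP service sketch (hypothetical, not IBM's code).
// A container like this can be built into an image and scaled out on
// Red Hat OpenShift, or run serverlessly on IBM Cloud Code Engine.
import http from "node:http";

// Such platforms typically inject the listening port via the PORT env variable.
const port = Number(process.env.PORT ?? 8080);

const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url?.startsWith("/health")) {
    // Health endpoint so the platform can probe readiness and scale pods.
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ status: "ok" }));
    return;
  }
  res.writeHead(404);
  res.end();
});

server.listen(port, () => {
  console.log(`insight service listening on port ${port}`);
});
```

Because the service keeps no local state, the platform can add or remove instances freely, which is what makes the flexible scaling described above possible.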

GRAMMY Insights with IBM Watson Architecture

According to IBM, the system is divided into two phases: “knowledge mining” and “fact discovery.” Let us look at the technical points of the explanation in detail.

◎ Knowledge mining

❶ A corpus is built by extracting information from more than 100,000 news sites as well as articles and content from Wikipedia, Dragnet, and GRAMMY.com.

❷ “Insight Generator” was designed as the broker for all messages and data streams; it was developed on Node.js.

❸ A persistent Redis in-memory database was adopted as a scalable state-management mechanism and is used widely across all of the applications.

- Most internal processing is handled by Bull, a Redis-based queue manager, in order to keep state persistent across multiple applications (see the sketch after this list).

❹ At system startup, Insight Generator queries an application running on IBM Cloud Code Engine for a four-tier priority list of artists.

❺ Retrieved articles are sent to the “Extractive Summarization” service on Red Hat OpenShift, where the NLP functionality of IBM Watson Discovery extracts the most distinctive text.

❻ At this point, Insight Generator stores all of the candidate factoids, articles, and metadata from the mining results.
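To make steps ❷, ❸, and ❺ more concrete, the following is a minimal sketch, assuming a Node.js/TypeScript broker that pushes fetched articles through a Bull queue backed by Redis and hands them to an extractive-summarization service. The queue name, job shape, and SUMMARIZER_URL endpoint are hypothetical; this illustrates the pattern, not IBM's actual code.

```typescript
// Sketch of an Insight Generator-style broker using Bull (a Redis-backed
// queue manager for Node.js). All names and endpoints are hypothetical.
import Queue from "bull";

interface ArticleJob {
  artistId: string;
  url: string;
  text: string;
}

// Bull keeps job state in Redis, so several applications can share the queue
// and state survives restarts of individual pods.
const summarizeQueue = new Queue<ArticleJob>("extractive-summarization", {
  redis: { host: process.env.REDIS_HOST ?? "localhost", port: 6379 },
});

// Producer side: enqueue articles fetched during corpus building.
export async function enqueueArticle(job: ArticleJob): Promise<void> {
  await summarizeQueue.add(job, { attempts: 3, removeOnComplete: true });
}

// Consumer side: a worker pulls jobs and calls a summarization service
// (a placeholder REST endpoint here) to extract the most distinctive text.
summarizeQueue.process(async (job) => {
  const res = await fetch(
    process.env.SUMMARIZER_URL ?? "http://summarizer/summarize",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ text: job.data.text }),
    },
  );
  const { summary } = (await res.json()) as { summary: string };
  return { artistId: job.data.artistId, url: job.data.url, summary };
});
```

Because Bull stores jobs in Redis, the producer and the consumer can run in separate pods and scale independently, which matches the pod-based layout described in the architecture above.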

◎ Fact discovery (measuring the polarity of insights)

❼ A BERT model (a natural language processing model) was trained on more than 408,000 samples from IBM Project Debater and LexisNexis to determine how strongly the extracted text supports the artist.

❽ The resulting polarity data is stored in the IBM Cloudant NLP Store, and Redis is updated accordingly.

➒ High-quality factoids are tagged with a concept for each category.

➓ Statistical Information and Relation Extraction (SIRE), IBM's custom entity-detection system for natural language processing, generates entities. These entities are used to further filter the facts into additional categories, and the remaining facts are persisted to the IBM Cloudant NLP Store together with their corresponding entities.

⓫ Once a fact is approved, the generated JSON is uploaded to IBM Cloud Object Storage, and pre-processing on the IBM Content Delivery Network (CDN) sends the data to Grammy.com (⓭). A sketch of this flow is shown below.
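As a rough sketch of steps ❽ to ⓫, the code below filters factoids by polarity, persists the survivors with their entities, and uploads the approved set as JSON for delivery. The polarity threshold, the store interfaces, and the object key are illustrative assumptions; they stand in for the IBM Cloudant and IBM Cloud Object Storage clients rather than reproducing their real APIs.

```typescript
// Hypothetical post-processing sketch for the fact-discovery phase.
interface Factoid {
  artistId: string;
  text: string;
  polarity: number;    // BERT model score: how strongly the text supports the artist
  entities: string[];  // entities produced by custom entity detection (e.g. SIRE)
  approved: boolean;   // set by a human reviewer
}

interface DocumentStore {   // stand-in for a Cloudant database client
  save(doc: Factoid): Promise<void>;
}

interface ObjectStorage {   // stand-in for a Cloud Object Storage bucket client
  putObject(key: string, body: string): Promise<void>;
}

const POLARITY_THRESHOLD = 0.8; // illustrative cut-off, not IBM's value

export async function publishApprovedFacts(
  candidates: Factoid[],
  store: DocumentStore,
  cos: ObjectStorage,
): Promise<void> {
  // Keep only high-quality factoids and persist each together with its entities.
  const highQuality = candidates.filter((f) => f.polarity >= POLARITY_THRESHOLD);
  for (const fact of highQuality) {
    await store.save(fact);
  }

  // Approved factoids are serialized to JSON and uploaded so the CDN can
  // deliver them to Grammy.com.
  const approved = highQuality.filter((f) => f.approved);
  await cos.putObject(
    "grammy-insights/approved-facts.json",
    JSON.stringify(approved),
  );
}
```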

The AI pipeline above consists of seven applications running as six Docker images across 54 pods. As a result, data mining and information discovery for the 1,000 Grammy nominees could be completed within an hour.

The pipeline was developed in Visual Studio Code. When the code is ready, changes are committed and pushed to GitLab. A deployment runner then builds a Docker image on the GitLab machine and pushes it to the Red Hat OpenShift image repository. When an image change is detected, the new image is deployed to all of that application's pods and exposed through a route for use as a REST service. Changes are rolled out in a canary fashion so that the current workload is not interrupted.

GRAMMY Insights development flow with IBM Watson

The 64th Annual Grammy Awards, which will be streamed live on Grammy.com on April 3 (starting at 7:00 a.m. on April 4, Japan time), is expected to be watched by tens of millions of music fans around the world. The IBM and Academy teams are ready to deliver insights on the red-carpet celebrities and nominees.

・Press release: “IBM and National Academy of Recording Arts and Sciences Offer Fan Experience with Watson at 64th Annual Grammy Awards and Artist Page”
・Engineer’s blog: “Transforming data into insight at the GRAMMY Awards” (English)
https://developer.ibm.com/articles/transforming-data-into-insight-at-the-grammy-awards/

[i Magazine・IS magazine]
