Abstract
Millions of people now share their television (TV) experience with others through social media such as Twitter or Facebook using mobile devices. TV is widely regarded as having been repurposed for social networks, a trend known as social television. A key function of social television is allowing viewers to exchange comments through their mobile devices, creating the impression of watching TV alongside a group of friends. To do so, mobile devices must be aware of the current media context (identifier and progress) of what the TV is playing, a task known as synchronizing context from TVs to mobile devices. Unfortunately, most legacy systems either cannot track playback progress or require upgrading the TV hardware. To address this issue, we design a purely software-based solution, called TagScreen, which inserts a series of hidden sound markers into the audio of the content. TagScreen supports long-range, multipath-resistant synchronization at second-level granularity, independent of device diversity, over severely frequency-selective acoustic channels. We implement TagScreen using commercial off-the-shelf (COTS) TVs and mobile devices, and we have extensively tested the system on 150 movies and 150 TV series across five different environments. Results show that TagScreen achieves a mean recognition accuracy of 98% at distances up to 35 m and a mean tracking accuracy of 97%.
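The abstract does not detail TagScreen's actual encoding scheme, so the sketch below only illustrates the general idea it builds on: embedding a short, low-amplitude, near-inaudible acoustic marker into a soundtrack and recovering its position on a receiver by cross-correlation. All names, frequencies, amplitudes, and durations here (`make_marker`, `MARKER_FREQ`, etc.) are illustrative assumptions, not the paper's design.

```python
# Minimal sketch of an acoustic context marker, assuming a near-ultrasonic
# tone carrier and matched-filter detection. This is NOT TagScreen's actual
# encoding scheme; all constants are illustrative assumptions.
import numpy as np

FS = 44100           # sample rate in Hz (assumed)
MARKER_FREQ = 18500  # near-inaudible carrier in Hz (assumed)
MARKER_LEN = 0.1     # marker duration in seconds (assumed)

def make_marker() -> np.ndarray:
    """Generate a short, low-amplitude high-frequency tone marker."""
    t = np.arange(int(FS * MARKER_LEN)) / FS
    return 0.02 * np.sin(2 * np.pi * MARKER_FREQ * t)

def embed_marker(audio: np.ndarray, position: int) -> np.ndarray:
    """Mix the marker into the soundtrack at a known sample offset."""
    marker = make_marker()
    out = audio.copy()
    out[position:position + len(marker)] += marker
    return out

def detect_marker(recording: np.ndarray) -> int:
    """Locate the marker in a recording via cross-correlation (matched filter)."""
    marker = make_marker()
    corr = np.correlate(recording, marker, mode="valid")
    return int(np.argmax(np.abs(corr)))  # sample index of the best match

# Example: embed a marker 1 s into a 5 s clip and recover its position.
clip = 0.1 * np.random.randn(5 * FS)      # noise as a stand-in for TV audio
tagged = embed_marker(clip, position=FS)  # marker placed at t = 1.0 s
offset = detect_marker(tagged)
print(f"Detected marker near t = {offset / FS:.2f} s")
```

In TagScreen's setting, as described in the abstract, the markers would additionally convey the content identifier and playback progress, and detection would have to remain robust to long range, multipath, and frequency-selective speaker and microphone responses; this sketch omits all of those aspects.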
Original language | English |
---|---|
Title of host publication | INFOCOM 2017 - IEEE Conference on Computer Communications |
Publisher | IEEE |
ISBN (Electronic) | 9781509053360 |
DOIs | |
Publication status | Published - 2 Oct 2017 |
Event | 2017 IEEE Conference on Computer Communications, INFOCOM 2017 - Atlanta, United States. Duration: 1 May 2017 → 4 May 2017 |
Conference
Conference | 2017 IEEE Conference on Computer Communications, INFOCOM 2017 |
---|---|
Country/Territory | United States |
City | Atlanta |
Period | 1/05/17 → 4/05/17 |
ASJC Scopus subject areas
- General Computer Science
- Electrical and Electronic Engineering