Esports matches can be incredibly fast-paced and complex to follow, but a new tool, Echo, aims to make esports more engaging for the 350 million-strong esports audience. Launched at ESL One Hamburg, Echo automatically detects extraordinary performances in live matches and lets broadcast operators generate visual overlays that tell data-driven stories about players to the live audience.
Echo was developed here at the Digital Creativity Labs at the University of York, in close collaboration with our partners at ESL, the world’s largest esports company. It is the first in a series of new production tools that translate large volumes of esports metrics into visually appealing, informative stories, augmenting the viewing experience for millions of esports fans. Released through a collaboration between the University of York, one of the top universities in the UK, and ESL, the product draws on Digital Creativity Labs’ research in Information Visualisation, Game Analytics, Machine Learning and Interactive Broadcasting.
York is one of the oldest cities in the world, famed for historic buildings dating back to the Roman conquest of Britain. At the University of York, next to an idyllic duck pond in the heart of Northern England, the old is meeting the new: a team of interdisciplinary scientists works with the esports industry to build innovative tools that make esports more engaging to watch and interact with, and accessible to more people. Echo, the first in this series of tools, gives esports tournament organisers the ability to automatically detect extraordinary plays and events in live matches and generate graphics that help the audience appreciate what is happening in the extremely fast-paced and complicated world of professional gameplay.
To make this happen, we assembled a database of performance data from professional players and teams. Echo constantly monitors live data and compares it against thousands of previous professional performances. Whenever something unusual happens in the current match, such as a record being broken or a player performing extraordinarily well or badly, it is brought to the attention of the Echo operators. Echo also provides a direct interface to the graphics system, so with one click the operator can send the selected statistic to the audience. Echo thus offers an end-to-end mechanism for converting extraordinary events into meaningful graphics that the audience can understand.
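To give a flavour of this kind of detection, here is a minimal sketch of how a live statistic might be compared against a historical database of professional performances. This is purely illustrative: Echo's actual detection logic, data model, and thresholds are not described here, and the statistic name and threshold below are assumptions.

```python
# Illustrative sketch only -- not Echo's real implementation.
# Hypothetical approach: flag a live player statistic whose z-score
# against a database of past professional performances is extreme.
from statistics import mean, stdev

def is_extraordinary(live_value, historical_values, z_threshold=2.5):
    """Return True if the live value lies far outside the historical
    distribution of professional performances (assumed threshold)."""
    mu = mean(historical_values)
    sigma = stdev(historical_values)
    if sigma == 0:
        return live_value != mu
    z = (live_value - mu) / sigma
    return abs(z) >= z_threshold

# Made-up gold-per-minute figures from past professional matches
history = [480, 510, 495, 520, 505, 490, 515, 500]
print(is_extraordinary(640, history))  # True: far above the historical range
print(is_extraordinary(505, history))  # False: a typical value
```

In a production system the comparison would of course run continuously over many statistics at once, surfacing only the flagged ones to the operators for a one-click push to the graphics system.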
While real-time monitoring of match data is available to some pro-level analysts and teams, Echo goes beyond monitoring by giving the broadcaster an interface to easily access the data, identify meaningful moments, and seamlessly pass this information on to the audience.
We have many more tools under development. The Echo team, led by my incredibly talented colleague Dr Florian Block, is working on apps that tell visual stories with data, on applying AI techniques to help content creators extract value from esports data, and more. With so much data available, combined with a tech-savvy audience, esports provide an incredibly innovative space. As Florian noted in the press release: “When tradition and new comes together this often creates the space for radical innovation. Esports – combining elements from traditional sports, video games and interactive technology – provide an incredibly fertile ground for exploring entirely new ways for fans to engage with and benefit from playing and watching esports”.
Across all of the projects being researched and developed at Digital Creativity Labs, our goal is to give teams, broadcasters and audiences new tools for making esports engaging, and to make the games more accessible to ever-growing audiences.
For more about the Digital Creativity Labs and its esports team, see here.
For more about the ESL ONE Hamburg, see here.
For more about the ESL, the world’s largest esports company, see here.
A sample clip showing Echo in action can be found here.
(Originally published on the DC Labs website here).