Dr. Per Fröjdh
Per Fröjdh is Director of Media Standardization at Ericsson and has thirty years’ experience in research and innovation. He received his MSc and PhD degrees after studies at Chalmers University in Gothenburg, Sweden, and at Imperial College in London, UK. Subsequent to postdoctoral appointments in Seattle, USA, and Copenhagen, Denmark, he accepted a faculty position at Stockholm University in Sweden as Professor of Theoretical Physics before joining Ericsson as Manager of Video Research and Standardization. Per has contributed to standards for video compression, streaming and file formats in MPEG, served on the advisory committee for the W3C, and has been the editor of 15 standards in MPEG, ITU, 3GPP and IETF. Currently he is the Multimedia Chair at the Swedish Standards Institute and Head of Delegation to MPEG for Sweden. He is also Board Director, Treasurer and Promotions Chair of the DASH Industry Forum as well as Steering Board Member of DVB.
5G Roadmap for Media – Standardization of Mobile AR / VR and Media Production
Mobile Virtual Reality (VR) and Augmented Reality (AR) applications have become popular alongside a wide range of head-mounted displays and similar devices powered by smartphones. By standardizing immersive audiovisual content formats and streaming protocols in MPEG and other organizations, it is possible to reach different devices with the same content and deliver services to many users in a scalable fashion. Truly immersive experiences also pose strict requirements on mobility, high data rates and minimized delays, which is why the emerging 5G telecom standard provides the basis for delivering high-quality VR and AR.
5G is becoming relevant for the media production vertical in a variety of production scenarios, such as indoor or hot-spot venues. Professional cameras are getting smaller, and new types of cameras, including smartphones and action cameras, are becoming available. 5G offers much higher bitrates and much greater system capacity at lower latencies. With higher bitrates and lower latencies, professional and semi-professional cameras can become wireless, streaming data from different scenarios directly into the studio. System features such as network slicing, edge computing and local breakout can bring media production components closer to the cameras.