- Semiconductor Technology Now
Report Series
Referees make video refereeing calls
Meanwhile, video refereeing decisions are made by a video assistant referee (VAR). In rugby and soccer matches, for example, a VAR in the video room continuously watches footage captured from multiple angles and instantly replays the relevant video whenever something unclear occurs, skillfully using functions such as slow-motion replay and zoom to assist the call. The 2018 FIFA World Cup held in Russia used 13 VARs, all of whom are said to have been stationed in Moscow, making calls on video sent from the 12 match venues over optical fiber.
A VAR’s job is to assist with refereeing calls. Some argue that the VAR system takes too long or interrupts the flow of the game, but it is said to have clearly improved the quality of play. For example, the number of simulation acts at the 2018 World Cup in Russia was reportedly down 43.7% from the previous tournament, and only four red cards were issued. Since prior World Cups saw red cards in the double digits, this can be considered a dramatic improvement in the quality of play.
Regarding dangerous plays during games, it is sometimes doctors rather than referees who use the system, as a first step in treatment. In rugby, for example, when a player suffers a suspected concussion, the doctor always checks the footage to determine whether the blow was to the player’s chin or temple before deciding on the proper treatment, even if the player insists he is fine.
Further technological improvements will raise accuracy
Video refereeing has improved in accuracy largely due to technological advances. The cameras zoom in far enough for the ball to be identified, and 30 images per second are sent from each camera to a computer at the stadium (ordinary TV broadcasting also displays images at 30 frames per second*1). The computer then stitches the received images together. For example, starting from a front view, the synthesized viewpoint shifts gradually to the right, 180 degrees around to the far side, then continues rotating around the stadium and back to the original front position. This is image synthesis.
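The stitching step described above can be illustrated with a minimal sketch. The function below is a toy example, not the actual stadium system: it joins two overlapping camera images (represented as NumPy arrays) by linearly blending the shared columns at the seam, which is the basic idea behind compositing adjacent viewpoints into one panorama.

```python
import numpy as np

def stitch_horizontal(left, right, overlap):
    """Stitch two images that share `overlap` columns, linearly blending the seam."""
    w = np.linspace(1.0, 0.0, overlap)               # blend weights: left image fades out
    seam = left[:, -overlap:] * w + right[:, :overlap] * (1.0 - w)
    return np.hstack([left[:, :-overlap], seam, right[:, overlap:]])

# Two tiny 2x4 "images" whose last/first two columns overlap.
left = np.arange(8.0).reshape(2, 4)
right = np.arange(8.0).reshape(2, 4) + 10.0
panorama = stitch_horizontal(left, right, overlap=2)
```

A real system would first align the images geometrically before blending; this sketch assumes the overlap is already registered.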
The technology used for drawing graphics from synthesized video to show the ball’s position relative to the line, as when a player makes a challenge in tennis, is the same surround-view technology used when parking a car. To shift the viewpoint little by little, images from the various cameras are synthesized and displayed. To change the viewpoint entirely, a coordinate transformation is used to render the car in graphics as though it were seen from above, as in surround view. In tennis, successive images of the moving ball are slightly overlapped and simultaneously overlaid with the ball’s trajectory. To display the result more quickly, the surrounding imagery is omitted, since rendering it would lengthen the processing time, and the ball itself is drawn as a graphic.
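The coordinate transformation mentioned above is typically a planar projective transform (homography) that remaps camera pixels onto a top-down ground plane. The sketch below shows the core arithmetic with a purely hypothetical 3x3 matrix `H`; in practice the matrix would be obtained by calibrating each camera.

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 planar homography H to an array of (x, y) pixel points."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
    mapped = homog @ H.T                              # projective transform
    return mapped[:, :2] / mapped[:, 2:3]             # divide out the scale factor

# Hypothetical homography mapping a camera's ground-plane pixels
# to a top-down ("surround view") coordinate frame.
H = np.array([[1.0, 0.2,   5.0],
              [0.0, 1.5,   3.0],
              [0.0, 0.001, 1.0]])

top_down = warp_points(H, [[100.0, 200.0], [640.0, 360.0]])
```

The division by the third homogeneous coordinate is what makes the transform projective rather than a simple rotation or scaling, which is why distant pixels compress correctly in the bird’s-eye view.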
To maintain the quality of large images, such as those covering a whole stadium, high-resolution 4K or 8K images are sent instead of ordinary high-definition (HD) video. To make this possible, the images are sent to the computer at a high bit rate over optical fiber. Since fully stitching the images together inside the computer would take too long, graphic images of the ball are drawn and overlaid on the video instead. Even this involves a massive amount of computation, so a high-speed computer is required.
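A back-of-the-envelope calculation shows why optical fiber is needed. Assuming uncompressed 8-bit RGB (24 bits per pixel) at 30 frames/sec, which is an illustrative assumption rather than the actual transmission format, the raw bit rates come out as follows:

```python
def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bit rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# Standard frame sizes for HD, 4K, and 8K video.
formats = {"HD": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
rates = {name: raw_bitrate_gbps(w, h, 30) for name, (w, h) in formats.items()}
for name, gbps in rates.items():
    print(f"{name}: {gbps:.2f} Gbps uncompressed")
```

Even before any multi-camera synthesis, uncompressed 8K at 30 frames/sec is on the order of tens of gigabits per second, far beyond ordinary broadcast links.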
Additionally, for use in making refereeing calls, if the frame rate (the number of image frames drawn per second) is increased to 120 frames/sec or even 240 frames/sec, an extremely fast computation speed becomes necessary, requiring a high-end central processing unit (CPU). For this reason, Intel Corporation, the U.S. high-end CPU maker, has developed a 360-degree replay system called “True View” as an application example of its CPUs, although the company says the number of actual adoptions is still small.
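The computational cost scales linearly with frame rate, as this small sketch shows for a 4K stream (the pixel-counting model is a simplification that ignores per-frame algorithmic overhead):

```python
def pixels_per_second(width, height, fps):
    """Pixels the processor must handle each second at a given frame rate."""
    return width * height * fps

# Relative load of higher frame rates at 4K, versus the 30 frames/sec baseline.
base = pixels_per_second(3840, 2160, 30)
ratios = {fps: pixels_per_second(3840, 2160, fps) / base for fps in (30, 120, 240)}
print(ratios)  # {30: 1.0, 120: 4.0, 240: 8.0}
```

Going from 30 to 240 frames/sec multiplies the pixel throughput eightfold, which is why the step up in frame rate, on top of the step up in resolution, pushes the system toward high-end CPUs.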
In addition to hardware, software algorithms are also used to reduce the computational load, such as erasing the images of the surrounding area when displaying graphic images and expressing only the ball in simple graphics.
In video refereeing, referees are still making the judgment calls for the time being. However, if AI is used to define fair play clearly, it will become possible for AI to make those calls. This will require developing technology for defining fair plays and having the computer learn them. Since AI-based decision-making is nearly real-time, it would mean no more waiting for the challenge system’s final decision in tennis.
Adopting the technology for education, too
It is said that the kinds of refereeing systems described so far are in some cases beginning to be applied to club activities and sports education in junior and senior high schools. Fair refereeing and the spirit of sportsmanship should reduce the number of simulation acts in soccer and guide children toward a healthy involvement in sports. Since the systems introduced in this series of articles are highly effective in preventing injury, they are also expected to be used in fitness gyms and in cultivating healthy sports for young people.
[Note]
- *1
- While movies use 24 frames/sec, TVs use 30 frames/sec to show moving images. One frame means one image; moving images are created by continuously showing static images at a rate of 24 or 30 per second.
Writer
Kenji Tsuda
International technology journalist, technology analyst
Kenji Tsuda is a freelance technology journalist who writes in both English and Japanese. With over 30 years of experience covering the semiconductor industry, Tsuda has been offering insights to the industry through his blog (newsandchips.com) and analytical articles. He is editor-in-chief of the Semiconportal site (www.semiconportal.com) and writes the “Car Electronics” article series for the Mynavi News site as a columnist.
Tsuda started his career as a semiconductor device development engineer before becoming a reporter for Nikkei Electronics magazine at Nikkei McGraw-Hill (now Nikkei BP). At the company, he launched several magazines, including Nikkei Microdevices and Semiconductor International Japanese Edition (in Japanese), and Nikkei Electronics Asia, Electronic Business Japan, and Design News Japan (in English). Tsuda went freelance in June 2007 as an international technology journalist. Books he has authored in Japanese include Megatrend in Semiconductors 2014-2023 (Nikkei BP), Why We Shouldn’t Let Go of the Semiconductor Industry (Nikkan Kogyo Shimbun, Ltd.), The Truth about the European Fabless Semiconductor Industry (Nikkan Kogyo Shimbun, Ltd.), and The Latest Trends in Green Semiconductor Technology and New Businesses 2011 (Impress Corp.).