TALK ABSTRACT: My research group focuses on using embedded systems to improve the fan viewing experience, refereeing, scouting, and athletic performance in sports. In this talk, I will focus on two ongoing research projects, YinzCam and Myron. YinzCam allows fans, right from their seats during a sporting event, to create and view their own instant replays, watch the action from multiple live camera angles, receive automated instant replays seconds after a play has happened (from all available camera angles), and access real-time statistics, player rosters, and more, all on their own Wi-Fi-enabled smartphones and all without violating broadcast rights. YinzCam was first deployed as a pilot with the Pittsburgh Penguins in 2008, remained in use through the 2009 Stanley Cup playoffs and the Stanley Cup Final, and is now deployed with multiple other teams and leagues. Myron uses a synergistic combination of sensors, communication protocols, computer vision, and machine learning techniques to provide enhanced tracking and motion analysis during practices or games in American football. We have developed a smart football that can be used to track the ball's trajectory and landing position on the field of play. We have also developed embedded coaching aids that help running backs, quarterbacks, wide receivers, and punters train reproducibly and independently of their coaches. The resulting data can also serve as an indicator of an individual player's performance. This talk will highlight how YinzCam and Myron were conceived, built, and deployed, and what we have learned (both positive and negative results and experiences) in the process.