
Retain Human Element In Sports’ GenAI Use Cases

  • Writer: Julie Ask
  • Mar 19
  • 2 min read

Updated: Apr 28

Background: I had the privilege of moderating a panel of female executives in sports management at Oakland Roots and Soul + Northeastern University’s inaugural Roots & Rising event on February 28th. Our topic was the role of technology and data in sports management. The conversation focused primarily on using insights to make smart business decisions in sports. Naturally, the discussion included AI and generative AI.


Here is a bit of context: 


  • The panelists each had a distinct objective function: generating revenue while serving a community, ensuring the post-graduation success of student athletes, and helping professional football (the European type) managers make smart decisions about player acquisitions.

  • Their near- to medium-term use cases for generative AI included question-and-answer interfaces, video analysis (i.e., as a product), drafting press releases, and conducting background research online. Think enhancement of a human’s abilities rather than replacement. (A minimal sketch of the question-answering idea follows this list.)

  • The median age in the audience was probably around 30, with (probably) zero engineering, data science, or other STEM degrees, based on a show of hands. More than 95% were women, most had played sports, and the group was ethnically very diverse.
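
To make the first of those use cases, a question-answering interface, a bit more concrete, here is a minimal sketch of the enhancement-not-replacement pattern, assuming the OpenAI Python SDK. The model name, the example question, and the snippet of club data are hypothetical placeholders, and the drafted answer still goes to a person for review before anyone acts on it.

# Minimal sketch: a question-answering helper that drafts an answer from
# club data and hands it to a human for review (enhancement, not replacement).
# Assumes the OpenAI Python SDK (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable. Model name and data are placeholders.
from openai import OpenAI

client = OpenAI()

# Hypothetical snippet of internal club data the assistant may draw on.
CLUB_CONTEXT = """
Season ticket renewals: 78% as of Feb 1.
Community match days: first Sunday of each month, 500 discounted seats.
"""

def draft_answer(question: str) -> str:
    """Ask the model to draft an answer grounded only in CLUB_CONTEXT."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Answer using only the provided club data. "
                        "If the data does not cover the question, say so."},
            {"role": "user",
             "content": f"Data:\n{CLUB_CONTEXT}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    draft = draft_answer("How are season ticket renewals tracking?")
    print("DRAFT (needs human review before it is shared):\n", draft)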


Why were there concerns about fully agentic technology or unsupervised automation? What did we discuss?


  • Panelists and members of the audience were concerned about the training data because they weren’t sure it represented them well enough.

  • The right answer may require judgment, i.e., it may not be obvious or binary. Sports is incredibly dynamic, including value, pricing, and more. A single athlete’s injury, arrival, or departure from a team can rapidly change the value of a ticket. In another scenario, giving a local community access must be balanced against profits. (A toy sketch of this pricing tension follows this list.)

  • In the context of an intake mechanism for information from student-athletes, we discussed trust and privacy issues. Were students more likely to be honest with a bot than with a human? Maybe. As someone who has fielded a lot of consumer surveys, I mentioned that consumers tend to over-report how often they work out and under-report how frequently they consume alcohol or sugar. They do much the same when speaking with their physicians.
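
To illustrate the judgment point with a toy example of my own (not something the panel built): a naive, fully automated pricing rule can react to a star player’s availability, but it knows nothing about the community-access commitments the panelists described, which is exactly where a human reviewer is needed. All names and numbers below are made up.

# Toy sketch (hypothetical numbers): a naive automated pricing rule that
# reacts to a star player's availability. It has no notion of the club's
# community-access commitments, so its output is a suggestion for a human
# to review, not a price to publish automatically.
from dataclasses import dataclass

@dataclass
class Match:
    base_price: float          # normal ticket price in dollars
    star_player_available: bool
    community_match_day: bool  # discounted seats promised to local fans

def suggest_price(match: Match) -> float:
    """Return a suggested ticket price from a simple demand rule."""
    price = match.base_price
    if match.star_player_available:
        price *= 1.4   # demand spike when the star is confirmed
    else:
        price *= 0.85  # softer demand after an injury or departure
    return round(price, 2)

if __name__ == "__main__":
    match = Match(base_price=30.0, star_player_available=True,
                  community_match_day=True)
    print(f"Model suggests ${suggest_price(match)} per ticket.")
    if match.community_match_day:
        # The judgment call the rule cannot make: honoring community access
        # may matter more than capturing the demand spike.
        print("Community match day: a person should decide whether to "
              "hold discounted seats at the original price.")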
