Common Sense Media, a popular resource for parents, to review AI products’ suitability for kids
Common Sense, a well-known nonprofit organization devoted to consumer privacy, digital citizenship, and media ratings for parents who want to evaluate the apps, games, podcasts, TV shows, movies, and books their children consume, announced this morning that it will introduce a new category to its ratings and reviews system: AI technology products. The organization says it will build a new rating system that assesses AI products across a number of dimensions, including whether the technology employs “responsible AI practices” and whether it is suitable for children.
The new AI product reviews will focus particularly on products used by kids and educators, Common Sense notes.
The decision to include AI products in its lineup came after a survey Common Sense conducted with Impact Research, which found that 82% of parents wanted a rating system to help them evaluate whether new AI products, like ChatGPT, are appropriate for children. Over three-quarters of respondents (77%) also said they were interested in AI-powered products that could help children learn, but only 40% said they knew of a reliable resource for learning about AI products’ appropriateness for their kids.
The new system will be designed with help from experts in the field of artificial intelligence and will also aim to inform forthcoming legislative and regulatory efforts around online safety for minors, the organization said.
Common Sense’s independent media ratings and reviews system, found at commonsensemedia.org, typically provides an age-appropriateness rating alongside measures of how much positive or negative content the media may contain, across areas like positive role models and messages and diverse representation, or, on the negative side, violence, vulgarity, scariness, drug use, and more. It’s not yet clear how Common Sense plans to rate AI products under the new system.
However, in a position paper published in April 2023, Common Sense warned about several issues with modern-day AI, including a frequent lack of “meaningful guardrails” that can make these products dangerous for children. It also warned that generative AI technologies, like OpenAI’s ChatGPT, can be susceptible to bias in their training data, which draws on large datasets of websites, books, and articles scraped from the internet.
“Beyond concerns about copyright…the amount of data needed to train generative models almost guarantees that any generative model has been trained to some degree on biased information, stereotypes, and misinformation, all of which may be propagated or reproduced when producing output,” the paper noted.
Common Sense did not share a timeframe for when it expects the new AI ratings and reviews system to launch, but it stressed the urgency of building such a resource.
“We must act now to ensure that parents, educators, and young people are informed about the perils and possibilities of AI and products like ChatGPT,” said James P. Steyer, founder and CEO of Common Sense Media, in a statement. “It is critical that there be a trusted, independent third-party rating system to inform the public and policymakers about the incredible impact of AI. In recent years, a number of tech companies have put profits before the well-being of kids and families. We have seen this movie before, with the emergence of social media and the subsequent failure to regulate these platforms, and, unfortunately, we know how that version ends. Quite simply, we must not make the same mistakes with AI, which will have even greater consequences for kids and society,” he added.
In addition to its media ratings, the organization’s for-profit affiliate, Common Sense Networks, drew on its kid-friendly recommendations to launch a streaming service called Sensical back in 2021. The service offers age-appropriate videos for children ages 2 through 10.