Artificial intelligence (AI) has permeated every aspect of our lives, from rewriting famous novels to composing songs and creating realistic videos. While older generations may fear its dominance, today’s youth are wholeheartedly embracing the technology. According to a November 2023 study by the Pew Research Center, nearly one in five teenagers between the ages of 13 and 17 who are aware of the AI chatbot ChatGPT have used it to help with their schoolwork. That amounts to approximately 13% of all teenagers in the United States.
The same study also found that seven out of ten teens believe it is acceptable to use chatbots like ChatGPT for researching and exploring new topics. These findings raise concerns, however, particularly around AI’s tendency to generate misleading or false information. Google’s AI chatbot Gemini recently came under scrutiny for producing inaccurate and “woke” depictions of historical scenes, prompting the company to issue an apology.
There is no denying that AI has become an integral part of the younger generation’s lives. However, questions persist regarding the best practices for integrating AI into youth education in a productive and safe manner. Cointelegraph spoke with Brandon Da Silva, CEO of ArenaX Labs, to gain insights into how AI can be effectively implemented in youth education.
ArenaX Labs recently launched AI Arena, a player-vs-player fighting game in which players train AI models to battle each other autonomously. The game’s primary objective is to build AI literacy through gameplay. According to Da Silva, teaching young people how to train and program AI goes beyond simply opening a tool like ChatGPT and asking questions.
Da Silva emphasized the importance of understanding why AI tools like ChatGPT give the answers they do. He warned against accepting AI-generated information as gospel and stressed the need to critically evaluate its responses. He believes children who interact with AI from a young age are more likely to become technologically savvy than peers who do not.
Da Silva drew a parallel to programming, noting that those who learn to code at a young age often develop skills that outpace even some full-time adult programmers. However, he acknowledged that AI usage among children is a multifaceted issue, as excessive reliance on the technology can pose risks if not monitored closely.
Addressing these challenges requires proper AI education and guidance from educators who themselves use AI. Da Silva emphasized the need to develop skills for identifying biases in AI, much like spotting deepfakes. He believes teaching these skills to young people is essential, and that AI interactions should be tailored to suit different types of learners.
Another significant aspect to consider is the emotional connection that individuals can develop with AI. Research from the Digital Wellness Lab suggests that children can form one-way emotional attachments, known as “parasocial relationships,” with AI-enabled digital assistants. The study found that a majority of children aged six to ten considered their familiar digital assistant to be smart and even viewed it as a friend.
Da Silva highlighted the need to recognize these emotional connections, acknowledging that objectivity can take a backseat when people form such bonds with AI. This can lead individuals, especially young people, to trust AI as an authority without fact-checking its information.
As AI continues to evolve rapidly, society finds itself at a critical moment: humans are still grappling with understanding the technology itself. Yet this also presents an opportunity for today’s youth to safely learn and engage with a tool that will undoubtedly shape their future.