HELSINKI--(BUSINESS WIRE)--#AI--Saidot, a leader in AI governance, is rallying leaders worldwide to confront the hidden risks of AI and to push for a more transparent and collaborative approach to AI safety. The campaign aims to make AI governance relevant and inclusive for everyone by turning a metaphor into reality: sending influential figures bespoke six-fingered gloves to symbolise the risks within AI models that often slip by unnoticed.
Saidot’s campaign, titled “What’s Your 6th Finger?”, plays on a well-known tendency of AI models to generate images of people with unexpected additional fingers. This symbolic ‘6th finger’ is a powerful reminder to all AI stakeholders of the need for rigorous and continuous governance throughout the AI product lifecycle. Common risks that are often overlooked, despite their real-world impact, include:
- Biased outcomes – AI models often inherit biases from their training data, which can lead to ethical problems when they are used, for example, in recruitment.
- Copyright infringement – AI models may be trained on copyrighted content, potentially exposing users to infringement without their realising it.
- False information – AI models still fabricate information at random. This phenomenon, known as hallucination, can lead to serious problems if left unchecked.
To make these concerns more accessible and relevant to a broader audience, Saidot has sent bespoke six-fingered gloves to prominent figures shaping the future of AI, symbolising these hidden risks. Recipients span developers, policymakers, and influential public figures, including:
- Sam Altman, CEO of OpenAI: Recognised for delaying the public release of the highly realistic video generator Sora until it is safer; his gloves serve as a reminder to align AI with human values by encouraging ecosystem collaboration.
- Ursula von der Leyen, President of the European Commission: Acknowledged for her leadership in enacting the EU AI Act; her gloves represent the need for effective and innovation-friendly implementation of the regulation.
- Mark Zuckerberg, Founder, Chairman and CEO of Meta: Acknowledged for Meta’s significant investments in open-source AI models; his gloves serve as a reminder of the need to ensure that all training data for AI models is sustainably sourced.
- Scarlett Johansson, actress: In 2024, Johansson was praised for being outspoken about the misuse of deepfakes after OpenAI used a voice strikingly similar to hers in its ChatGPT product; her gloves serve as a reminder that everyone owns the rights to their own voice and likeness.
Alongside these high-profile individuals, Saidot has sent the symbolic gloves to UK Prime Minister Keir Starmer; Clément Delangue, CEO of Hugging Face; Henna Virkkunen, EU Commissioner-Designate; Sebastian Siemiatkowski, CEO of Klarna; and Rishi Bommasani, Society Lead at the Stanford Center for Research on Foundation Models.
Meeri Haataja, CEO and Co-Founder of Saidot, said: “AI systems are often likened to a ‘black box’, with layers of complexity and risks that remain hidden from view. However, that complexity cannot be used as an excuse for not solving problems that are already well known. Responsible AI governance requires action and collaboration from all parts of the AI value chain, which is why we have also recognised actions by different types of operators that contribute to a more responsible AI ecosystem.”
Veera Siivonen, CCO and Co-Founder of Saidot, added: “AI is developing so fast that nobody can fully anticipate its impacts and emerging risks. That’s why we want to highlight both the steps already taken towards safer AI and some of the steps that still need to be taken. As AI is an essential technology that can profoundly impact our lives, we must ensure everyone understands that, alongside the great progress AI brings, there are also significant risks. The gloves are a way to make these concerns more tangible and understandable for all.”
Contacts
Veera Siivonen
[email protected]