Meta has declined to sign the European Union's code of practice for its AI Act as new regulations for general-purpose AI models take effect. Joel Kaplan, Meta's head of global affairs, voiced his concerns in a LinkedIn post, arguing that Europe is heading down the wrong path on AI. He said that while the European Commission's Code of Practice is intended to create a compliance framework, it introduces considerable legal uncertainty for developers and includes requirements that go beyond the scope of the AI Act. The recently published voluntary code is meant to help companies comply with the forthcoming AI regulations: it requires them to provide and regularly update documentation about their AI tools, refrain from training on pirated material, and honor content owners' requests to exclude their works from training datasets. Kaplan argued that the EU's regulatory approach could stifle the development of innovative AI technologies in Europe and harm local businesses seeking to build on them.
The AI Act classifies AI applications by risk. It outright bans certain "unacceptable risk" uses, such as cognitive manipulation and social scoring, and designates applications such as biometrics and facial recognition as "high-risk." Developers must register their AI systems and meet risk- and quality-management requirements. Major technology companies, including Alphabet, Meta, Microsoft, and Mistral AI, have pushed back against the rules and asked for a delay in the implementation timeline, but the European Commission remains committed to the original schedule. The EU has also issued guidelines for providers of "general-purpose AI models with systemic risk," such as OpenAI and Google, ahead of the rules taking effect on August 2. Models that were on the market before that date must be brought into full compliance by August 2, 2027.
The ainewsarticles.com article above is a brief synopsis; the full original article is available at the source.