The European Union (EU) is expected to announce a proposal for artificial intelligence (AI) regulations next week.
The European Commission (EC), the EU’s executive body, has been preparing this proposal for more than a year, and major technology corporations are concerned that the EU’s definition of AI will be too broad.
The regulations are part of the EU’s effort to govern the AI sector and catch up with the US and China in a field that spans everything from voice recognition to insurance and law enforcement.
The draft would prohibit indiscriminate surveillance of citizens, as well as any technology used to manipulate citizens’ behavior, opinions or decisions. Military applications of AI fall outside the scope of the regulations.
Depending on the severity of the violation, businesses that break these rules can be fined up to 4% of their global revenue.
To encourage innovation, Brussels also wants to provide a clear legal framework for businesses across all 27 member countries of the bloc.
To achieve this, the draft regulation states that companies must obtain special authorization for applications deemed “high-risk” before they can be brought to market.
High-risk systems are those that serve “the function of identifying people in public places from a distance through biometric features”, as well as those that act as “safety components in critical public infrastructure”.
Applications not considered “high-risk”, meanwhile, will face no obligations beyond the regulations already in force.
Google and other major tech corporations are very interested in the EU’s AI strategy, as Europe is often the place that sets the standard for how technology is governed around the world.
Last year, Google warned that the EU’s definition of AI was too broad and that Brussels should avoid excessive control over a key technology.
The draft regulations must be ratified by member countries and approved by the European Parliament before they can take effect.