5 January 2022
of the manufacturer of AI systems, insofar as the latter can weaponize the opacity of AI systems to effectively shield itself from discrimination claims. [...] The regulation sets out a special procedure for processing sensitive data12 and safeguards specific [...] Personal data protection regimes can chip at exploitative data harvesting and discriminatory profiling. [...]

11 EU, Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).

[...] follows: "In order to protect the right of others from the discrimination that might result from the bias in AI systems, the providers should be able to process also special categories of personal data" [...] Similarly, a think tank backed by US companies projects that the AI Act will cost the European economy €31 billion in the next five years and reduce AI investments by 20 percent.17 [...] For instance, data trusts could counter the "dataopolies" of the "Frightful Five" (Alphabet, Amazon, Apple, Facebook and Microsoft) and rebalance the power asymmetry between [...] Policy makers can compel the business sector to disclose the performance metrics of their AI systems. [...] The flexibility pertaining to the use of standards, the type of assessment and the specific third party performing it illustrates the potential of co-regulation. [...] the European Union's AI Act or via technical standards, co-regulation favours buy-in through an increased sense of ownership for private actors.