{"id":4732,"date":"2025-09-29T19:17:05","date_gmt":"2025-09-29T19:17:05","guid":{"rendered":"https:\/\/palmer-consulting.com\/ia-act\/"},"modified":"2025-09-29T19:17:05","modified_gmt":"2025-09-29T19:17:05","slug":"ia-act","status":"publish","type":"post","link":"https:\/\/palmer-consulting.com\/en\/ia-act\/","title":{"rendered":"IA Act"},"content":{"rendered":"<h1 data-start=\"249\" data-end=\"346\">The IA Act: everything you need to know about Europe&#8217;s first artificial intelligence regulation<\/h1>\n<h2 data-start=\"348\" data-end=\"363\">Introduction<\/h2>\n<p data-start=\"364\" data-end=\"689\">In 2024, the European Union adopted the<strong data-start=\"402\" data-end=\"412\">AI Act<\/strong> (or <strong data-start=\"417\" data-end=\"448\">Artificial Intelligence Act<\/strong>), a major piece of legislation framing the development and use of artificial intelligence (AI) in Europe. It is the <strong data-start=\"569\" data-end=\"616\">world&#8217;s first legislation dedicated to AI<\/strong>, and aims to become an international reference standard. <\/p>\n<p data-start=\"691\" data-end=\"835\">The aim is twofold: to <strong data-start=\"715\" data-end=\"741\">encourage innovation<\/strong> while <strong data-start=\"750\" data-end=\"819\">guaranteeing security and the protection of<\/strong> citizens&#8217; <strong data-start=\"750\" data-end=\"819\">fundamental rights<\/strong>.<\/p>\n<p data-start=\"837\" data-end=\"1030\">In this article, we explain in detail what the AI Act is, its main provisions, its impacts for businesses, and why it represents a turning point in AI regulation.<\/p>\n<hr data-start=\"1032\" data-end=\"1035\">\n<p data-start=\"158\" data-end=\"607\">The<strong data-start=\"160\" data-end=\"170\">AI Act<\/strong> (Artificial Intelligence Act) is the first <strong data-start=\"216\" data-end=\"238\">European regulation<\/strong> entirely dedicated to artificial intelligence. 
Unlike a directive, this regulation is <strong data-start=\"338\" data-end=\"414\">directly applicable in all EU member states<\/strong>, without the need for transposition into national law. This guarantees Europe-wide <strong data-start=\"490\" data-end=\"518\">harmonization of rules<\/strong>, avoiding disparate legislation from one country to another.<\/p>\n<p data-start=\"609\" data-end=\"852\">Its main objective is to <strong data-start=\"639\" data-end=\"673\">create a clear legal framework<\/strong> for the design, development, deployment and use of artificial intelligence systems. This framework aims to reconcile two strategic priorities: <\/p>\n<ol data-start=\"853\" data-end=\"1143\">\n<li data-start=\"853\" data-end=\"958\">\n<p data-start=\"856\" data-end=\"958\"><strong data-start=\"856\" data-end=\"903\">Encouraging innovation and competitiveness<\/strong> of European companies in the field of AI.<\/p>\n<\/li>\n<li data-start=\"959\" data-end=\"1143\">\n<p data-start=\"962\" data-end=\"1143\"><strong data-start=\"962\" data-end=\"1016\">Protecting citizens and their fundamental rights<\/strong> in the face of potential abuses linked to certain AI applications (discrimination, intrusive surveillance, manipulation).<\/p>\n<\/li>\n<\/ol>\n<h3 data-start=\"1145\" data-end=\"1184\">A risk-based approach<\/h3>\n<p data-start=\"1185\" data-end=\"1423\">The AI Act is based on a <strong data-start=\"1209\" data-end=\"1259\">proportionate, risk-based approach<\/strong>. The more dangerous an artificial intelligence system is deemed to be for the safety, health or rights of citizens, the stricter the rules it must comply with. 
<\/p>\n<p data-start=\"1425\" data-end=\"1441\">In concrete terms:<\/p>\n<ul data-start=\"1442\" data-end=\"2200\">\n<li data-start=\"1442\" data-end=\"1641\">\n<p data-start=\"1444\" data-end=\"1641\">Systems that present an <strong data-start=\"1471\" data-end=\"1494\">unacceptable risk<\/strong> (such as real-time facial recognition in public spaces or social scoring inspired by the Chinese model) are <strong data-start=\"1625\" data-end=\"1638\">prohibited<\/strong> outright.<\/p>\n<\/li>\n<li data-start=\"1642\" data-end=\"1902\">\n<p data-start=\"1644\" data-end=\"1902\"><strong data-start=\"1659\" data-end=\"1674\">High-risk<\/strong> systems (e.g. AI medical diagnostics, algorithms used in recruitment, critical infrastructures) are permitted, but subject to <strong data-start=\"1810\" data-end=\"1839\">very strict obligations<\/strong> of transparency, documentation and human supervision.<\/p>\n<\/li>\n<li data-start=\"1903\" data-end=\"2060\">\n<p data-start=\"1905\" data-end=\"2060\"><strong data-start=\"1920\" data-end=\"1937\">Limited-risk<\/strong> systems must comply with transparency obligations (for example, informing users that they are interacting with a chatbot).<\/p>\n<\/li>\n<li data-start=\"2061\" data-end=\"2200\">\n<p data-start=\"2063\" data-end=\"2200\"><strong data-start=\"2078\" data-end=\"2096\">Minimal-risk<\/strong> systems (video games, spam filters, office tools) are not subject to any particular constraints.<\/p>\n<\/li>\n<\/ul>\n<h3 data-start=\"2202\" data-end=\"2239\">A very broad field of application<\/h3>\n<p data-start=\"2240\" data-end=\"2437\">The AI Act does not just concern <strong data-start=\"2280\" data-end=\"2313\">large technology groups<\/strong>. 
It applies to <strong data-start=\"2331\" data-end=\"2409\">any organization developing or using artificial intelligence<\/strong> in the European Union: <\/p>\n<ul data-start=\"2438\" data-end=\"2770\">\n<li data-start=\"2438\" data-end=\"2530\">\n<p data-start=\"2440\" data-end=\"2530\"><strong data-start=\"2440\" data-end=\"2483\">Major international companies<\/strong> offering AI-based solutions.<\/p>\n<\/li>\n<li data-start=\"2531\" data-end=\"2650\">\n<p data-start=\"2533\" data-end=\"2650\"><strong data-start=\"2533\" data-end=\"2568\">European SMEs and startups<\/strong>, which will have to integrate these new rules into the design of their products.<\/p>\n<\/li>\n<li data-start=\"2651\" data-end=\"2770\">\n<p data-start=\"2653\" data-end=\"2770\"><strong data-start=\"2653\" data-end=\"2676\">Public players<\/strong> (administrations, hospitals, local authorities) using AI systems in their services.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"2772\" data-end=\"3000\">The ambition is to lay the foundations for <strong data-start=\"2812\" data-end=\"2831\">trustworthy AI<\/strong> that is both innovative and respectful of human rights, by positioning Europe as a <strong data-start=\"2923\" data-end=\"2966\">world leader in the ethical regulation<\/strong> of artificial intelligence.<\/p>\n<hr data-start=\"1590\" data-end=\"1593\">\n<h2 data-start=\"191\" data-end=\"243\">The four levels of risk defined by the AI Act<\/h2>\n<p data-start=\"245\" data-end=\"578\">The AI Act is based on a <strong data-start=\"269\" data-end=\"359\">classification of artificial intelligence systems into four risk categories<\/strong>. This approach makes it possible to tailor legal obligations according to the potential danger posed by AI to citizens and society. The higher the risk, the stricter the regulatory requirements.<\/p>\n<h3 data-start=\"580\" data-end=\"632\">1. 
Unacceptable risk: prohibited uses<\/h3>\n<p data-start=\"633\" data-end=\"887\">Some AI applications are deemed to be <strong data-start=\"673\" data-end=\"711\">contrary to fundamental rights<\/strong> and are therefore <strong data-start=\"725\" data-end=\"752\">strictly prohibited<\/strong> in the European Union.<br data-start=\"777\" data-end=\"780\">These practices represent a direct threat to the freedom, dignity and privacy of individuals.<\/p>\n<p data-start=\"889\" data-end=\"914\"><strong data-start=\"889\" data-end=\"912\">Concrete examples:<\/strong><\/p>\n<ul data-start=\"915\" data-end=\"1421\">\n<li data-start=\"915\" data-end=\"1050\">\n<p data-start=\"917\" data-end=\"1050\"><strong data-start=\"920\" data-end=\"971\">Real-time mass biometric surveillance<\/strong>, notably via facial recognition in public places.<\/p>\n<\/li>\n<li data-start=\"1051\" data-end=\"1224\">\n<p data-start=\"1053\" data-end=\"1224\"><strong data-start=\"1069\" data-end=\"1089\">Social scoring<\/strong> systems, such as those used in China, which classify individuals according to their behavior or creditworthiness.<\/p>\n<\/li>\n<li data-start=\"1225\" data-end=\"1421\">\n<p data-start=\"1227\" data-end=\"1421\">Large-scale <strong data-start=\"1230\" data-end=\"1279\">behavioral or psychological manipulation<\/strong>, aimed at exploiting people&#8217;s vulnerabilities (for example, targeting children with manipulative advertising).<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"1423\" data-end=\"1584\">The ban on these uses positions Europe as a pioneer in the <strong data-start=\"1508\" data-end=\"1581\">protection of individual liberties in the face of technological overreach<\/strong>.<\/p>\n<hr data-start=\"1586\" data-end=\"1589\">\n<h3 data-start=\"1591\" data-end=\"1646\">2. 
High-risk: strictly controlled uses<\/h3>\n<p data-start=\"1647\" data-end=\"1936\">So-called &#8220;<strong data-start=\"1674\" data-end=\"1689\">high-risk<\/strong>&#8221; AI systems are authorized, but are subject to <strong data-start=\"1733\" data-end=\"1756\">rigorous controls<\/strong>. These uses are considered essential in certain sectors (health, transport, education, etc.), but can have serious consequences in the event of failure or bias. <\/p>\n<p data-start=\"1938\" data-end=\"1963\"><strong data-start=\"1938\" data-end=\"1961\">Concrete examples:<\/strong><\/p>\n<ul data-start=\"1964\" data-end=\"2430\">\n<li data-start=\"1964\" data-end=\"2103\">\n<p data-start=\"1966\" data-end=\"2103\">AI used in <strong data-start=\"1988\" data-end=\"2041\">recruitment and human resources management<\/strong>, where a biased algorithm can lead to discrimination.<\/p>\n<\/li>\n<li data-start=\"2104\" data-end=\"2267\">\n<p data-start=\"2106\" data-end=\"2267\"><strong data-start=\"2110\" data-end=\"2144\">Medical diagnostic systems<\/strong> supported by artificial intelligence, which must guarantee maximum reliability to protect patients&#8217; health.<\/p>\n<\/li>\n<li data-start=\"2268\" data-end=\"2430\">\n<p data-start=\"2270\" data-end=\"2430\">Algorithms linked to <strong data-start=\"2297\" data-end=\"2326\">critical infrastructures<\/strong> such as transport, energy or security, where an error could have massive consequences.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"2432\" data-end=\"2484\"><strong data-start=\"2432\" data-end=\"2482\">Main obligations for companies:<\/strong><\/p>\n<ul data-start=\"2485\" data-end=\"2858\">\n<li data-start=\"2485\" data-end=\"2591\">\n<p data-start=\"2487\" data-end=\"2591\"><strong data-start=\"2487\" data-end=\"2515\">Data transparency<\/strong>: the origin and quality of the data used must be documented.<\/p>\n<\/li>\n<li data-start=\"2592\" data-end=\"2719\">\n<p data-start=\"2594\" data-end=\"2719\"><strong 
data-start=\"2594\" data-end=\"2631\">Detailed technical documentation<\/strong>: each system must be accompanied by a clear description of how it works.<\/p>\n<\/li>\n<li data-start=\"2720\" data-end=\"2858\">\n<p data-start=\"2722\" data-end=\"2858\"><strong data-start=\"2722\" data-end=\"2757\">Compulsory human supervision<\/strong>: humans must remain in the decision-making loop to avoid drift or automatic errors.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"2860\" data-end=\"3005\">This category is undoubtedly the most restrictive for companies, but it is essential for building <strong data-start=\"2983\" data-end=\"3002\">trusting AI<\/strong>.<\/p>\n<hr data-start=\"3007\" data-end=\"3010\">\n<h3 data-start=\"3012\" data-end=\"3062\">3. Limited risk: mandatory transparency<\/h3>\n<p data-start=\"3063\" data-end=\"3257\"><strong data-start=\"3078\" data-end=\"3095\">Limited-risk<\/strong> systems don&#8217;t have as many technical constraints as high-risk systems, but they do have to comply with rules of <strong data-start=\"3211\" data-end=\"3227\">transparency<\/strong> for users.<\/p>\n<p data-start=\"3259\" data-end=\"3284\"><strong data-start=\"3259\" data-end=\"3282\">Concrete examples:<\/strong><\/p>\n<ul data-start=\"3285\" data-end=\"3586\">\n<li data-start=\"3285\" data-end=\"3406\">\n<p data-start=\"3287\" data-end=\"3406\"><strong data-start=\"3291\" data-end=\"3321\">Customer service chatbots<\/strong>: users must be informed that they are dealing with an AI, not a human.<\/p>\n<\/li>\n<li data-start=\"3407\" data-end=\"3586\">\n<p data-start=\"3409\" data-end=\"3586\"><strong data-start=\"3423\" data-end=\"3448\">Content generation<\/strong> tools (text, image, video) such as <strong data-start=\"3477\" data-end=\"3510\">ChatGPT, DALL-E or MidJourney<\/strong>: they must clearly indicate that content has been produced by an AI.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"3588\" data-end=\"3752\">The aim is to <strong data-start=\"3609\" 
data-end=\"3665\">prevent any risk of confusion or manipulation<\/strong>, and to ensure that users can interact with full knowledge of the facts.<\/p>\n<hr data-start=\"3754\" data-end=\"3757\">\n<h3 data-start=\"3759\" data-end=\"3812\">4. Minimal risk: unrestricted use<\/h3>\n<p data-start=\"3813\" data-end=\"4021\">The vast majority of AI applications fall into the <strong data-start=\"3880\" data-end=\"3898\">minimal risk<\/strong> category. These systems are considered harmless to citizens, and therefore require no special obligations. <\/p>\n<p data-start=\"4023\" data-end=\"4048\"><strong data-start=\"4023\" data-end=\"4046\">Concrete examples:<\/strong><\/p>\n<ul data-start=\"4049\" data-end=\"4302\">\n<li data-start=\"4049\" data-end=\"4127\">\n<p data-start=\"4051\" data-end=\"4127\"><strong data-start=\"4055\" data-end=\"4084\">Video games using AI<\/strong> to enhance the user experience.<\/p>\n<\/li>\n<li data-start=\"4128\" data-end=\"4188\">\n<p data-start=\"4130\" data-end=\"4188\">E-mail <strong data-start=\"4134\" data-end=\"4155\">spam filters<\/strong>.<\/p>\n<\/li>\n<li data-start=\"4189\" data-end=\"4302\">\n<p data-start=\"4191\" data-end=\"4302\">Music and movie recommendation systems, used by platforms such as Spotify and Netflix.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4304\" data-end=\"4498\">This category illustrates the AI Act&#8217;s determination <strong data-start=\"4358\" data-end=\"4401\">not to hold back innovation unnecessarily<\/strong>, by giving companies a great deal of freedom for applications considered safe.<\/p>\n<hr data-start=\"2978\" data-end=\"2981\">\n<h2 data-start=\"2983\" data-end=\"3027\">Key obligations for companies<\/h2>\n<p data-start=\"3028\" data-end=\"3129\">The IA Act introduces a strict legal framework for organizations. 
Here are the major points to remember: <\/p>\n<ol data-start=\"3131\" data-end=\"3726\">\n<li data-start=\"3131\" data-end=\"3251\">\n<p data-start=\"3134\" data-end=\"3251\"><strong data-start=\"3134\" data-end=\"3164\">Mandatory registration<\/strong>: high-risk AI systems will have to be registered in a European database.<\/p>\n<\/li>\n<li data-start=\"3252\" data-end=\"3368\">\n<p data-start=\"3255\" data-end=\"3368\"><strong data-start=\"3255\" data-end=\"3286\">Transparency and traceability<\/strong>: obligation to provide clear information on how AI works.<\/p>\n<\/li>\n<li data-start=\"3369\" data-end=\"3472\">\n<p data-start=\"3372\" data-end=\"3472\"><strong data-start=\"3372\" data-end=\"3395\">Human supervision<\/strong>: the human element must remain in the decision-making loop to avoid any drift.<\/p>\n<\/li>\n<li data-start=\"3473\" data-end=\"3580\">\n<p data-start=\"3476\" data-end=\"3580\"><strong data-start=\"3476\" data-end=\"3504\">Conformity assessment<\/strong>: companies will have to prove that their AI complies with regulations.<\/p>\n<\/li>\n<li data-start=\"3581\" data-end=\"3726\">\n<p data-start=\"3584\" data-end=\"3726\"><strong data-start=\"3584\" data-end=\"3609\">Financial penalties<\/strong>: fines of up to <strong data-start=\"3639\" data-end=\"3698\">35 million euros or 7% of worldwide annual turnover<\/strong>, whichever is higher, for non-compliance.<\/p>\n<\/li>\n<\/ol>\n<hr data-start=\"3728\" data-end=\"3731\">\n<h2 data-start=\"3733\" data-end=\"3763\">Implementation schedule<\/h2>\n<p data-start=\"3764\" data-end=\"3844\">The regulation was <strong data-start=\"3783\" data-end=\"3801\">adopted in 2024<\/strong>, but its application will be gradual:<\/p>\n<ul data-start=\"3846\" data-end=\"4062\">\n<li data-start=\"3846\" data-end=\"3909\">\n<p data-start=\"3848\" data-end=\"3909\"><strong data-start=\"3848\" data-end=\"3856\">2025<\/strong>: ban on unacceptable-risk systems.<\/p>\n<\/li>\n<li data-start=\"3910\" data-end=\"3991\">\n<p 
data-start=\"3912\" data-end=\"3991\"><strong data-start=\"3912\" data-end=\"3920\">2026<\/strong>: obligations come into force for high-risk systems.<\/p>\n<\/li>\n<li data-start=\"3992\" data-end=\"4062\">\n<p data-start=\"3994\" data-end=\"4062\"><strong data-start=\"3994\" data-end=\"4002\">2027<\/strong>: full roll-out and implementation of the system.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4064\" data-end=\"4165\">This gives companies time to <strong data-start=\"4106\" data-end=\"4164\">prepare and bring their AI solutions into compliance<\/strong>.<\/p>\n<hr data-start=\"4167\" data-end=\"4170\">\n<h2 data-start=\"4172\" data-end=\"4211\">What are the implications for companies?<\/h2>\n<p data-start=\"4212\" data-end=\"4277\">The IA Act will have a <strong data-start=\"4229\" data-end=\"4274\">major impact on economic players<\/strong>.<\/p>\n<h3 data-start=\"4279\" data-end=\"4295\">Opportunities<\/h3>\n<ul data-start=\"4296\" data-end=\"4532\">\n<li data-start=\"4296\" data-end=\"4365\">\n<p data-start=\"4298\" data-end=\"4365\"><strong data-start=\"4298\" data-end=\"4330\">Increased<\/strong> user and customer <strong data-start=\"4298\" data-end=\"4330\">confidence<\/strong>.<\/p>\n<\/li>\n<li data-start=\"4366\" data-end=\"4450\">\n<p data-start=\"4368\" data-end=\"4450\"><strong data-start=\"4368\" data-end=\"4406\">Harmonization of rules in Europe<\/strong>, facilitating international deployment.<\/p>\n<\/li>\n<li data-start=\"4451\" data-end=\"4532\">\n<p data-start=\"4453\" data-end=\"4532\"><strong data-start=\"4453\" data-end=\"4476\">Competitive advantage<\/strong> for compliant companies (AI governance).<\/p>\n<\/li>\n<\/ul>\n<h3 data-start=\"4534\" data-end=\"4543\">Challenges<\/h3>\n<ul data-start=\"4544\" data-end=\"4739\">\n<li data-start=\"4544\" data-end=\"4604\">\n<p data-start=\"4546\" data-end=\"4604\"><strong data-start=\"4546\" data-end=\"4576\">High compliance costs<\/strong> for SMEs and startups.<\/p>\n<\/li>\n<li 
data-start=\"4605\" data-end=\"4664\">\n<p data-start=\"4607\" data-end=\"4664\"><strong data-start=\"4607\" data-end=\"4661\">Increased need for legal and technical expertise<\/strong>.<\/p>\n<\/li>\n<li data-start=\"4665\" data-end=\"4739\">\n<p data-start=\"4667\" data-end=\"4739\"><strong data-start=\"4667\" data-end=\"4702\">Risk of slowing down innovation<\/strong> in the face of red tape.<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"4741\" data-end=\"4744\">\n<h2 data-start=\"4746\" data-end=\"4772\">Generative AI and Act AI<\/h2>\n<p data-start=\"4773\" data-end=\"4872\">Special focus on<strong data-start=\"4808\" data-end=\"4825\">generative AI<\/strong> (ChatGPT, DALL-E, Gemini, Mistral AI, etc.).<\/p>\n<p data-start=\"4874\" data-end=\"4902\">Obligations include :<\/p>\n<ul data-start=\"4903\" data-end=\"5134\">\n<li data-start=\"4903\" data-end=\"4990\">\n<p data-start=\"4905\" data-end=\"4990\"><strong data-start=\"4905\" data-end=\"4962\">Clear indication when content is generated by AI<\/strong> (texts, images, videos).<\/p>\n<\/li>\n<li data-start=\"4991\" data-end=\"5050\">\n<p data-start=\"4993\" data-end=\"5050\"><strong data-start=\"4993\" data-end=\"5037\">Documentation of training data<\/strong> used.<\/p>\n<\/li>\n<li data-start=\"5051\" data-end=\"5134\">\n<p data-start=\"5053\" data-end=\"5134\"><strong data-start=\"5053\" data-end=\"5131\">Measures to prevent the generation of illegal or discriminatory content<\/strong>.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"5136\" data-end=\"5254\">These rules are designed to <strong data-start=\"5156\" data-end=\"5253\">protect users against deepfakes, misinformation and algorithmic bias<\/strong>.<\/p>\n<hr data-start=\"5256\" data-end=\"5259\">\n<h2 data-start=\"5261\" data-end=\"5301\">The AI Act in European strategy<\/h2>\n<p data-start=\"5302\" data-end=\"5391\">The IA Act is part of Europe&#8217;s desire to <strong data-start=\"5353\" data-end=\"5381\">create a third way<\/strong> between :<\/p>\n<ul 
data-start=\"5392\" data-end=\"5546\">\n<li data-start=\"5392\" data-end=\"5495\">\n<p data-start=\"5394\" data-end=\"5495\">the United States (rather permissive, with regulation based on corporate self-regulation);<\/p>\n<\/li>\n<li data-start=\"5496\" data-end=\"5546\">\n<p data-start=\"5498\" data-end=\"5546\">China (centralized, highly intrusive model).<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"5548\" data-end=\"5748\">With this regulation, the EU intends to impose an <strong data-start=\"5590\" data-end=\"5622\">ethical and responsible framework<\/strong> that could become a global standard, as was the case with the RGPD (General Data Protection Regulation).<\/p>\n<hr data-start=\"5750\" data-end=\"5753\">\n<h2 data-start=\"5755\" data-end=\"5768\">Conclusion<\/h2>\n<p data-start=\"5769\" data-end=\"5993\">The<strong data-start=\"5771\" data-end=\"5781\">AI Act<\/strong> marks a decisive step in the regulation of artificial intelligence. It imposes a <strong data-start=\"5873\" data-end=\"5896\">balanced approach<\/strong>, seeking to reconcile <strong data-start=\"5920\" data-end=\"5948\">technological innovation<\/strong> with the <strong data-start=\"5952\" data-end=\"5990\">protection of fundamental rights<\/strong>. 
<\/p>\n<p data-start=\"5995\" data-end=\"6183\">For companies, it represents both a <strong data-start=\"6044\" data-end=\"6052\">challenge<\/strong> (compliance, costs, organization) and an <strong data-start=\"6102\" data-end=\"6117\">opportunity<\/strong> (customer confidence, brand image, European expansion).<\/p>\n<p data-start=\"6185\" data-end=\"6339\">The future of AI in Europe will largely depend on the ability of economic players to <strong data-start=\"6279\" data-end=\"6336\">adapt quickly to this new regulatory framework<\/strong>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The IA Act: everything you need to know about Europe&#8217;s first artificial intelligence regulation Introduction In 2024, the European Union adopted theAI Act (or Artificial Intelligence Act), a major piece of legislation framing the development and use of artificial intelligence (AI) in Europe. It is the world&#8217;s first legislation dedicated to AI, and aims to [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"inline_featured_image":false,"footnotes":""},"categories":[78],"tags":[],"class_list":["post-4732","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>IA Act | Palmer<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/palmer-consulting.com\/en\/ia-act\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"IA Act | Palmer\" \/>\n<meta property=\"og:description\" content=\"The IA Act: everything you need to know about 
Europe&#8217;s first artificial intelligence regulation Introduction In 2024, the European Union adopted theAI Act (or Artificial Intelligence Act), a major piece of legislation framing the development and use of artificial intelligence (AI) in Europe. It is the world&#8217;s first legislation dedicated to AI, and aims to [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/palmer-consulting.com\/en\/ia-act\/\" \/>\n<meta property=\"og:site_name\" content=\"Palmer\" \/>\n<meta property=\"article:published_time\" content=\"2025-09-29T19:17:05+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/palmer-consulting.com\/wp-content\/uploads\/2023\/09\/social-graph-palmer.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"675\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Laurent Zennadi\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Laurent Zennadi\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/\"},\"author\":{\"name\":\"Laurent Zennadi\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#\\\/schema\\\/person\\\/7ea52877fd35814d1d2f8e6e03daa3ed\"},\"headline\":\"IA Act\",\"datePublished\":\"2025-09-29T19:17:05+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/\"},\"wordCount\":1332,\"publisher\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#organization\"},\"articleSection\":[\"Artificial intelligence\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/\",\"url\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/\",\"name\":\"IA Act | Palmer\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#website\"},\"datePublished\":\"2025-09-29T19:17:05+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/ia-act\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/home\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"IA Act\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#website\",\"url\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/\",\"name\":\"Palmer\",\"description\":\"Evolve at the speed of 
change\",\"publisher\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#organization\",\"name\":\"Palmer\",\"url\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/palmer-consulting.com\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/Palmer_Logo_Full_PenBlue_1x1-2.jpg\",\"contentUrl\":\"https:\\\/\\\/palmer-consulting.com\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/Palmer_Logo_Full_PenBlue_1x1-2.jpg\",\"width\":480,\"height\":480,\"caption\":\"Palmer\"},\"image\":{\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.linkedin.com\\\/company\\\/palmer-consulting\\\/\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/palmer-consulting.com\\\/en\\\/#\\\/schema\\\/person\\\/7ea52877fd35814d1d2f8e6e03daa3ed\",\"name\":\"Laurent Zennadi\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/110e8a99f01ca2c88c3d23656103640dc17e08eac86e26d0617937a6846b4007?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/110e8a99f01ca2c88c3d23656103640dc17e08eac86e26d0617937a6846b4007?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/110e8a99f01ca2c88c3d23656103640dc17e08eac86e26d0617937a6846b4007?s=96&d=mm&r=g\",\"caption\":\"Laurent Zennadi\"}}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"IA Act | Palmer","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/palmer-consulting.com\/en\/ia-act\/","og_locale":"en_US","og_type":"article","og_title":"IA Act | Palmer","og_description":"The IA Act: everything you need to know about Europe&#8217;s first artificial intelligence regulation Introduction In 2024, the European Union adopted theAI Act (or Artificial Intelligence Act), a major piece of legislation framing the development and use of artificial intelligence (AI) in Europe. It is the world&#8217;s first legislation dedicated to AI, and aims to [&hellip;]","og_url":"https:\/\/palmer-consulting.com\/en\/ia-act\/","og_site_name":"Palmer","article_published_time":"2025-09-29T19:17:05+00:00","og_image":[{"width":1200,"height":675,"url":"https:\/\/palmer-consulting.com\/wp-content\/uploads\/2023\/09\/social-graph-palmer.png","type":"image\/png"}],"author":"Laurent Zennadi","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Laurent Zennadi","Est. 
reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/palmer-consulting.com\/en\/ia-act\/#article","isPartOf":{"@id":"https:\/\/palmer-consulting.com\/en\/ia-act\/"},"author":{"name":"Laurent Zennadi","@id":"https:\/\/palmer-consulting.com\/en\/#\/schema\/person\/7ea52877fd35814d1d2f8e6e03daa3ed"},"headline":"IA Act","datePublished":"2025-09-29T19:17:05+00:00","mainEntityOfPage":{"@id":"https:\/\/palmer-consulting.com\/en\/ia-act\/"},"wordCount":1332,"publisher":{"@id":"https:\/\/palmer-consulting.com\/en\/#organization"},"articleSection":["Artificial intelligence"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/palmer-consulting.com\/en\/ia-act\/","url":"https:\/\/palmer-consulting.com\/en\/ia-act\/","name":"IA Act | Palmer","isPartOf":{"@id":"https:\/\/palmer-consulting.com\/en\/#website"},"datePublished":"2025-09-29T19:17:05+00:00","breadcrumb":{"@id":"https:\/\/palmer-consulting.com\/en\/ia-act\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/palmer-consulting.com\/en\/ia-act\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/palmer-consulting.com\/en\/ia-act\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/palmer-consulting.com\/en\/home\/"},{"@type":"ListItem","position":2,"name":"IA Act"}]},{"@type":"WebSite","@id":"https:\/\/palmer-consulting.com\/en\/#website","url":"https:\/\/palmer-consulting.com\/en\/","name":"Palmer","description":"Evolve at the speed of 
change","publisher":{"@id":"https:\/\/palmer-consulting.com\/en\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/palmer-consulting.com\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/palmer-consulting.com\/en\/#organization","name":"Palmer","url":"https:\/\/palmer-consulting.com\/en\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/palmer-consulting.com\/en\/#\/schema\/logo\/image\/","url":"https:\/\/palmer-consulting.com\/wp-content\/uploads\/2023\/08\/Palmer_Logo_Full_PenBlue_1x1-2.jpg","contentUrl":"https:\/\/palmer-consulting.com\/wp-content\/uploads\/2023\/08\/Palmer_Logo_Full_PenBlue_1x1-2.jpg","width":480,"height":480,"caption":"Palmer"},"image":{"@id":"https:\/\/palmer-consulting.com\/en\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.linkedin.com\/company\/palmer-consulting\/"]},{"@type":"Person","@id":"https:\/\/palmer-consulting.com\/en\/#\/schema\/person\/7ea52877fd35814d1d2f8e6e03daa3ed","name":"Laurent Zennadi","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/110e8a99f01ca2c88c3d23656103640dc17e08eac86e26d0617937a6846b4007?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/110e8a99f01ca2c88c3d23656103640dc17e08eac86e26d0617937a6846b4007?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/110e8a99f01ca2c88c3d23656103640dc17e08eac86e26d0617937a6846b4007?s=96&d=mm&r=g","caption":"Laurent 
Zennadi"}}]}},"_links":{"self":[{"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/posts\/4732","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/comments?post=4732"}],"version-history":[{"count":0,"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/posts\/4732\/revisions"}],"wp:attachment":[{"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/media?parent=4732"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/categories?post=4732"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/palmer-consulting.com\/en\/wp-json\/wp\/v2\/tags?post=4732"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}