Agefi Luxembourg - February 2025

The Impact of the EU AI Act on Luxembourg Funds and their Managers

By Dr. Sebastiaan Niels Hooghiemstra*

Regulation (EU) 2024/1689 (the "AI Act") was published in the EU's Official Journal on 12 July 2024. The AI Act lays down harmonized rules for the development, placing on the market, putting into service, and use of artificial intelligence ("AI") in the EU. Its goal is to promote the benefits of AI, such as innovation, while ensuring a high level of protection of fundamental rights. This contribution provides a preliminary analysis of the key aspects of the AI Act that are relevant for EU funds and their managers.

Scope of the AI Act

The AI Act regulates, amongst others, AI systems, as well as "providers" and "deployers" of these systems and models.

AI Models & General-purpose AI Models

The AI Act does not apply to all systems, but only to those that fulfil the definition of an "AI system" within the meaning of Article 3(1) AI Act. The definition of an AI system is therefore key to understanding the scope of application of the AI Act. Article 3(1) AI Act defines an AI system as follows:

"'AI system' means a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments".

Considering the wide variety of AI systems and with a view to future developments, Article 96(1)(f) AI Act requires the European Commission ("EC") to develop guidelines on the application of the definition of an AI system, as set out in Article 3(1) AI Act. On 6 February 2025, the EC published these guidelines to help interpret the definition of AI systems. The AI Act also contains various requirements with respect to providers of general-purpose AI models/systems.
Although it may well be the case that general-purpose AI models will be used by fund managers, for example for a chatbot, it is rather unlikely that fund managers will be "providers" of such models. Therefore, this contribution will not further discuss general-purpose AI systems/models.

Providers

In short, the AI Act defines "providers" as, amongst others, natural/legal persons that develop a (general-purpose) AI system/model and place it on the market or put the AI system into service under their own name or trademark. In the context of funds, fund managers could qualify as a "provider" if, for instance, they are loan-originating managers that develop and use their own AI system for assessing the creditworthiness of borrowers when originating loans. Fund managers would also qualify as such if they use an AI system for that purpose that has been developed by a third party but is being used by the fund manager under its own name or trademark. It is also to be noted that, even if a fund manager develops an AI system itself, it would at the same time qualify as a "deployer" under the AI Act if it is "using" that AI system.

In general, however, fund managers will mostly qualify as "deployers", and the AI Act provisions related to "providers" will, in practice, most likely be more relevant for the portfolio companies of the funds managed by fund managers than for the fund managers themselves.

Deployers

"Deployers" within the meaning of the AI Act are, amongst others, natural/legal persons using an AI system under their authority, except where the AI system is used in the course of a personal non-professional activity. Fund managers are, for example, deployers if they purchase an AI system developed by a third party to verify ID documents for AML/CFT purposes. The fine line between being a "provider" and a "deployer" is whether or not the AI system has been developed by a third party but is being used by the fund manager under its own name or trademark.
In the latter case, the fund manager could qualify as a "deployer" and a "provider" at the same time.

Extraterritorial effect

The AI Act applies to both EU and non-EU providers placing on the market or putting into service AI systems, or placing on the market general-purpose AI models, in the EU. Furthermore, the AI Act also applies to both EU and non-EU providers and deployers of AI systems, as long as the output produced by the AI system is used in the EU.

Risk classes

The AI Act introduces a risk-based approach and classifies AI systems into various categories, such as "prohibited AI systems", "high-risk AI systems" and "low-risk AI systems". The requirements imposed under the AI Act depend upon the risk assigned to an AI system.

Prohibited AI Practices

The AI Act recognizes that there are many beneficial uses of AI, but that it can also be misused and provide novel and powerful tools for manipulative, exploitative and social control practices. Such practices are particularly harmful and abusive, and the AI Act prohibits them because they contradict EU values of respect for human dignity, freedom, equality, democracy and the rule of law, as well as fundamental rights. To this end, the AI Act lists a number of AI practices that are prohibited. In the context of funds and their managers, these include, for example, the placing on the market, the putting into service or the use of an AI system that deploys subliminal techniques beyond a person's consciousness, or purposefully manipulative or deceptive techniques, that impair the ability of a person to make an informed decision and cause significant harm to them. Another example that could be relevant for funds and their managers is the prohibition of the placing on the market, the putting into service or the use of an AI system that exploits any of the vulnerabilities of a natural person or a specific group of persons due to their age, disability or a specific social or economic situation.
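By way of illustration only, the risk-based approach described above can be sketched as a simple triage of the kind a fund manager might use in an internal AI-system inventory. This is a hypothetical simplification, not a legal determination: the function name, the boolean input flags and the narrow-procedural-task flag (which anticipates the Annex III derogation) are all assumptions introduced here for the example.

```python
# Illustrative sketch only -- not legal advice. The tier names follow the
# AI Act's risk-based approach; the function name and input flags are
# hypothetical simplifications of the Act's actual tests.

def classify_risk_tier(uses_prohibited_practice: bool,
                       is_annex_iii_use_case: bool,
                       narrow_procedural_task: bool = False) -> str:
    """Assign an AI system to one of the AI Act's broad risk tiers."""
    if uses_prohibited_practice:
        # e.g. subliminal or purposefully manipulative techniques
        return "prohibited"
    if is_annex_iii_use_case and not narrow_procedural_task:
        # e.g. emotion recognition, recruitment, biometric identification
        return "high-risk"
    # everything outside the prohibited and high-risk categories
    return "low-risk"

# A (hypothetical) AML/KYC identity-verification tool, an Annex III use case:
print(classify_risk_tier(False, True))        # prints "high-risk"
# The same tool if it only performed a narrow procedural task (derogation):
print(classify_risk_tier(False, True, True))  # prints "low-risk"
```

The point of such an inventory is simply that the applicable obligations flow from the tier assigned; the actual assessment under the AI Act remains a documented legal analysis.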
High-risk AI systems

The vast majority of the requirements imposed by the AI Act concern high-risk AI systems. With respect to fund managers, mostly the high-risk AI systems referred to in Annex III AI Act are relevant. These AI systems include, for example, remote biometric identification systems, AI systems intended to be used for emotion recognition, as well as AI systems intended to be used for the recruitment or selection of natural persons in relation to job applications. With a view to funds and their managers, in particular those AI systems will qualify as "high-risk" that involve HR matters and the identification/verification of investors when, for example, performing AML/KYC checks.

By derogation, AI systems referred to in Annex III AI Act shall not be considered to be high-risk where they do not pose a significant risk of harm to the health, safety or fundamental rights of natural persons, including by not materially influencing the outcome of decision-making. This is, for example, the case if the AI system is intended to perform a narrow procedural task. Providers that consider that an AI system referred to in Annex III AI Act is not high-risk are required to document their assessment before that system is placed on the market or put into service. Furthermore, before placing on the market or putting into service an AI system for which the provider has concluded that it is not high-risk, that provider shall register itself and that system in the EU database for high-risk AI systems.

Low-risk AI systems

The AI Act does not define "low-risk AI systems". Instead, the AI Act refers to AI systems "other than high-risk AI systems". De facto, these are AI systems that do not involve either (i) prohibited AI practices or (ii) any of the AI systems that are listed as "high-risk AI systems" in the AI Act.
Although "low-risk AI systems" are not defined in the AI Act, the AI Act contains several provisions through which the AI Office and Member States are encouraged to facilitate the drawing up of codes of conduct intended to foster the voluntary application to "low-risk AI systems" of some or all of the mandatory requirements applicable to high-risk AI systems, adapted in light of the intended purpose of the systems and the lower risk involved, and taking into account the available technical solutions and industry best practices allowing for the application of such requirements. The objective of this is to encourage a larger uptake of ethical and trustworthy AI in the EU.

Requirements for "Providers" of high-risk AI systems

The AI Act lays down various requirements that are applicable to "providers" of "high-risk AI systems". Amongst others, such providers are required to establish, implement, document and maintain a risk management system in relation to high-risk AI systems, provided that the "provider" is subject to internal risk management requirements under EU law, which is the case for fund managers. Furthermore, providers are required to draw up technical documentation of a high-risk AI system before that system is placed on the market or put into service, to demonstrate that the high-risk AI system complies with the requirements laid down in the AI Act.

In addition, "providers" are required to automatically record logs over the lifetime of an AI system, to ensure that the "provider" can amend the system if it, for example, produces unethical results. Providers are also required to ensure that a high-risk AI system undergoes the relevant conformity assessment procedure laid down in the AI Act prior to its being placed on the market or put into service.
All "high-risk AI systems" are required to be registered by providers in the EU database for high-risk AI systems, even if such systems are ultimately assessed not to be "high-risk". Lastly, "providers" are required to put in place a quality management system in the form of written policies, procedures and instructions to ensure that corrective measures are taken if the AI system does not comply with the requirements of the AI Act.

Requirements for "Deployers" of high-risk AI systems

Deployers of high-risk AI systems are also subject to a number of requirements under the AI Act. In the first place, deployers of high-risk AI systems shall take appropriate technical and organizational measures to ensure that they use such systems in accordance with the instructions for use accompanying the systems. Deployers are also required to monitor the operation of the high-risk AI system on the basis of the instructions for use and, where relevant, inform providers. Where deployers have reason to consider that the use of the high-risk AI system in accordance with the instructions may result in that AI system presenting a risk, they are required, without undue delay, to inform the provider or distributor and the relevant market surveillance authority, and shall suspend the use of that system.

Deployers of high-risk AI systems are required to keep the logs automatically generated by that high-risk AI system, to the extent such logs are under their control, for a period appropriate to the intended purpose of the high-risk AI system, of at least six months, unless provided otherwise in applicable EU or national law (e.g., the GDPR). Fund managers are required to maintain the logs as part of the documentation kept pursuant to the relevant EU financial services law. Where applicable, deployers of high-risk AI systems shall use the information provided to deployers under the AI Act to comply with their obligation to carry out a data protection impact assessment under the GDPR.
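As a minimal sketch of the log-retention rule just described, the six-month floor can be combined with any longer period required by other applicable law by simply taking the longest of the periods. The function name, the 183-day approximation of six months and the sample retention periods are all hypothetical assumptions for illustration.

```python
# Illustrative sketch only -- not legal advice. Deployers must keep
# automatically generated logs for at least six months, unless applicable
# EU or national law (e.g. the GDPR or financial-services rules) provides
# otherwise. The function name and inputs are hypothetical.
from datetime import date, timedelta

SIX_MONTHS = timedelta(days=183)  # simplified six-month floor

def log_retention_until(log_date: date, *other_required: timedelta) -> date:
    """Earliest date on which a high-risk AI system log may be deleted."""
    # Apply the AI Act floor and any longer period required elsewhere.
    keep_for = max([SIX_MONTHS, *other_required])
    return log_date + keep_for

# Logs generated on 1 March 2025, with a (hypothetical) two-year retention
# period under sectoral financial-services rules:
print(log_retention_until(date(2025, 3, 1), timedelta(days=730)))  # 2027-03-01
```

In practice a fund manager would map these periods onto the retention schedule it already maintains for documentation kept under EU financial services law, as the article notes.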
Other relevant requirements include:
- the obligation for deployers to assign human oversight to natural persons who have the necessary competence, training and authority, as well as the necessary support;
- the obligation for deployers to inform persons that they are subject to the use of a high-risk AI system, including affected workers if a high-risk AI system is used at the workplace; and
- the obligation for deployers, in certain instances, to perform an assessment of the impact on fundamental rights that the use of such a system may produce.

Lastly, deployers are required to cooperate with the relevant competent authorities with respect to any action those authorities take in relation to the high-risk AI system in order to implement the AI Act.

The Impact of the AI Act on Managers and their Funds

It is clear that AI has the potential to transform the landscape of financial services, including fund management, by offering unparalleled opportunities for efficiency, innovation and improved decision-making. However, AI also presents inherent risks, including data quality issues and a (potential) lack of transparency. In this respect, the AI Act is to be welcomed.

The AI Act will impact the use of AI by fund managers in relation to AML/KYC, distribution (e.g. robo-advisory) and HR. However, the vast majority of deployment will take place in relation to portfolio management, compliance, risk management and operational efficiency. In these areas, the deployment of AI will, in most instances, be "low-risk". Hence, it is to be expected that EU bodies, such as ESMA, will adopt supplementary guidelines on the use of (low-risk) AI systems by fund managers, as ESMA has already done with respect to retail investment services in the MiFID II domain. For sure, more regulatory developments are to be expected in this domain in the forthcoming years.

(*) Dr.
Sebastiaan Hooghiemstra is a senior associate in the investment management practice of Loyens & Loeff Luxembourg and Senior Fellow of the International Center for Financial Law & Governance at the Erasmus University Rotterdam.