
personal injury, or damage to the product or other property. 505 The proposed PLD defines a defective product as “a product that does not provide the safety the consumer is entitled to expect, considering all circumstances”. It extends liability for harm from the ‘producer’ to the ‘economic operator’, defined as “the manufacturer of a product or component, the provider of a related service, the authorized representative, the importer, the fulfillment service provider or the distributor”, thereby broadening the range of parties that may be liable for harm caused to patients by AI tools applied in their healthcare delivery. 506 Notably, the notion of defect under the PLD focuses only on physical harm, excluding non-tangible harm such as privacy violations, cybersecurity flaws, and other risks. Developing countries will therefore have to adopt legal and regulatory frameworks that are cognizant of these non-tangible threats to individuals when applying AI to promote access to healthcare.

Developing nations may adopt national policies modeled on the proposed AILD, which seeks to lay down uniform rules on the civil liability of owners and users of AI. The AILD complements the PLD and follows the definition of high-risk AI systems in the proposed AI Act. It details rules on a claimant’s access to the defendant’s evidence, allowing (potential) claimants to request access to relevant evidence about a specific high-risk AI system suspected of having caused damage. 507 The proposed AILD empowers national courts to order the disclosure and preservation of evidence by the defendant; where the defendant fails to comply with such orders, non-compliance with a relevant duty of care is presumed. The defendant may, however, rebut this presumption by submitting evidence to the contrary. 508

Developing nations may also adopt the strict liability approach taken by the EU, which bridges responsibility gaps. It is essential to note, however, that many jurisdictions currently allow strict liability only for the civil compensation of losses, not for criminal liability, as punishment under criminal law requires culpability. This may leave a gap in criminal responsibility for harm caused by the application of AI to healthcare services. Typically, States ascribe criminal responsibility, in contrast to civil liability, in order to punish offenders rather than compensate victims, and to pursue further penological aims such as retribution or deterrence. 509 Developing nations seeking to apply AI tools to address access to healthcare challenges will need to adopt effective legal regulations that ensure adequate compensation for injured persons, provide for criminal liability in situations that go beyond civil liability, and preserve incentives for practical innovation and deployment of AI, so as to encourage the creation and adoption of AI tools that improve access to healthcare. These incentives, however, should never come at the cost of unnecessary harm to individuals.

505 Article 6(1) PLD proposal.
506 Article 7 PLD proposal.

507 Article 3 proposed AI Liability Directive.
508 Article 3(5) proposed AI Liability Directive.
509 Bublitz C and others, ‘Legal Liabilities of BCI-Users: Responsibility Gaps at the Intersection of Mind and Machine?’ (2019) 65 International Journal of Law and Psychiatry 101399.

