Colonna, Liane
Publications (10 of 37)
Colonna, L. (2025). The end of open source?: Regulating open source under the cyber resilience act and the new product liability directive. The Computer Law and Security Review, 56, Article ID 106105.
2025 (English). In: The Computer Law and Security Review, ISSN 0267-3649, Vol. 56, article id 106105. Article in journal (Refereed). Published.
Abstract [en]

Rooted in idealism, the open-source model leverages collaborative intelligence to drive innovation, leading to major benefits for both industry and society. As open-source software (OSS) plays an increasingly central role in driving the digitalization of society, policymakers are examining the interactions between upstream open-source communities and downstream manufacturers. They aim to leverage the benefits of OSS, such as performance enhancements and adaptability across diverse domains, while ensuring software security and accountability. The regulatory landscape is on the brink of a major transformation with the recent adoption of both the Cyber Resilience Act (CRA) and the Product Liability Directive (PLD), raising concerns that these laws could threaten the future of OSS.

This paper investigates how the CRA and the PLD regulate OSS, specifically exploring the scope of the exemptions found in the laws. It further explores how OSS practices might adapt to the evolving regulatory landscape, focusing on the importance of documentation practices to support compliance obligations, thereby ensuring OSS's continued relevance and viability. It concludes that due diligence requirements mandate a thorough assessment of OSS components to ensure their safety for integration into commercial products and services. Documentation practices such as security attestations, Software Bills of Materials (SBOMs), data cards, and model cards will play an increasingly important role in the software supply chain, ensuring that downstream entities can meet their obligations under these new legal frameworks.

National Category
Law
Identifiers
urn:nbn:se:su:diva-239444 (URN), 10.1016/j.clsr.2024.106105 (DOI), 2-s2.0-85213216046 (Scopus ID)
Funder
Wallenberg Foundations
Available from: 2025-02-12. Created: 2025-02-12. Last updated: 2025-02-12. Bibliographically approved.
Colonna, L. (2024). The AI Act’s Research Exemption: A Mechanism for Regulatory Arbitrage? In: Andreas Moberg; Eduardo Gill-Pedro (Ed.), The Yearbook of Socio-Economic Constitutions: Law and the Governance of Artificial Intelligence (pp. 51-93). Springer
2024 (English). In: The Yearbook of Socio-Economic Constitutions: Law and the Governance of Artificial Intelligence / [ed] Andreas Moberg; Eduardo Gill-Pedro, Springer, 2024, p. 51-93. Chapter in book (Refereed).
Abstract [en]

This paper argues that, by failing to acknowledge the complexity of modern research practices, which are shifting from single disciplines to multiple disciplines involving many entities, some public, some private, the proposed AI Act creates mechanisms for regulatory arbitrage. The article begins with a semantic analysis of the concept of research from a legal perspective. It then explains how the proposed AI Act addresses the concept of research by examining the research exemption set forth in the forthcoming law as it currently stands. After providing an overview of the proposed law, the paper explores the research exemption to highlight whether there are any gaps, ambiguities, or contradictions in the law that may be exploited by either public or private actors seeking to use the exemption as a shield to avoid compliance with duties imposed under the law.

To address whether the research exemption reflects a coherent legal rule, it is considered from five different perspectives. First, the paper examines the extent to which the research exemption applies to private or commercial entities that may not pursue research in a benevolent manner to solve societal problems, but nevertheless contribute to innovation and economic growth within the EU. Second, the paper explores how the exemption applies to research that takes place within academia but is on the path to commercialization. Third, it considers the situation where academic researchers invoke the exemption and then provide the AI they develop to their employing institutions or other public bodies at no cost. Fourth, the paper inspects how the exemption functions when researchers build high-risk or prohibited AI, publish their findings, or share them via an open-source platform, and other actors copy the AI. Finally, the paper considers how the exemption applies to research that takes place “in the wild” or in regulatory sandboxes.

Place, publisher, year, edition, pages
Springer, 2024
Series
YSEC Yearbook of Socio-Economic Constitutions; 2023
National Category
Law
Identifiers
urn:nbn:se:su:diva-226328 (URN), 10.1007/16495_2023_59 (DOI), 2-s2.0-86000572331 (Scopus ID), 978-3-031-55831-3 (ISBN), 978-3-031-55832-0 (ISBN)
Available from: 2024-02-07. Created: 2024-02-07. Last updated: 2025-06-02. Bibliographically approved.
Colonna, L. (2023). Exploring the Relationship Between Article 22 of the General Data Protection Regulation and Article 14 of the Proposed AI Act: Some Preliminary Observations and Critical Reflections. In: Martin Brinnen; Cecilia Magnusson Sjöberg; David Törngren; Daniel Westman; Sören Öman (Ed.), Dataskyddet 50 År – Historia, Aktuella problem och Framtid (pp. 443-465). Visby: Eddy.se
2023 (English). In: Dataskyddet 50 År – Historia, Aktuella problem och Framtid / [ed] Martin Brinnen; Cecilia Magnusson Sjöberg; David Törngren; Daniel Westman; Sören Öman, Visby: Eddy.se, 2023, p. 443-465. Chapter in book (Other academic).
Place, publisher, year, edition, pages
Visby: Eddy.se, 2023
National Category
Law (excluding Law and Society)
Identifiers
urn:nbn:se:su:diva-226327 (URN), 9789189840027 (ISBN)
Available from: 2024-02-07. Created: 2024-02-07. Last updated: 2024-02-08. Bibliographically approved.
Wilkowska, W., Offermann, J., Colonna, L., Florez-Revuelta, F., Climent-Pérez, P., Mihailidis, A., . . . Ziefle, M. (2023). Interdisciplinary perspectives on privacy awareness in lifelogging technology development. Journal of Ambient Intelligence and Humanized Computing, 14(3), 2291-2312
2023 (English). In: Journal of Ambient Intelligence and Humanized Computing, ISSN 1868-5137, E-ISSN 1868-5145, Vol. 14, no 3, p. 2291-2312. Article in journal (Refereed). Published.
Abstract [en]

Population aging resulting from demographic change requires challenging decisions and necessary steps by different stakeholders to manage current and future demand for assistance and support. The consequences of population aging can be mitigated to some extent by assistive technologies that support the autonomous living of older individuals and persons in need of care in their private environments for as long as possible. A variety of technical solutions are already available on the market, but privacy protection is a serious, often neglected, issue when using such (assistive) technology. Thus, privacy needs to be thoroughly taken into consideration in this context. In the three-year project PAAL (‘Privacy-Aware and Acceptable Lifelogging Services for Older and Frail People’), researchers from different disciplines, such as law, rehabilitation, human-computer interaction, and computer science, investigated the phenomenon of privacy in the use of assistive lifelogging technologies. In concrete terms, the concept of Privacy by Design was realized using two exemplary lifelogging applications in private and professional environments. A user-centered empirical approach was applied to the lifelogging technologies, investigating the perceptions and attitudes of (older) users with different health-related and biographical profiles. The knowledge gained through the interdisciplinary collaboration can improve the implementation and optimization of assistive applications. In this paper, partners of the PAAL project present insights gained from their cross-national, interdisciplinary work regarding privacy-aware and acceptable lifelogging technologies.

Keywords
Lifelogging applications, Privacy, Acceptance, Interdisciplinary project
National Category
Other Social Sciences
Identifiers
urn:nbn:se:su:diva-215107 (URN), 10.1007/s12652-022-04486-5 (DOI), 2-s2.0-85143671280 (Scopus ID)
Available from: 2023-02-28. Created: 2023-02-28. Last updated: 2023-03-08. Bibliographically approved.
Colonna, L. (2023). Teachers in the loop? An analysis of automatic assessment systems under Article 22 GDPR. International Data Privacy Law, 14(1), 3-18
2023 (English). In: International Data Privacy Law, ISSN 2044-3994, E-ISSN 2044-4001, Vol. 14, no 1, p. 3-18. Article in journal (Refereed). Published.
Abstract [en]

Key Points

  • This article argues that while there is great promise in the everyday automation of higher education to create benefits for students, efficiencies for instructors, and cost savings for institutions, it is important to critically consider how AI-based assessment will transform the role of teachers and the relationship between teachers and students.
  • The focus of the work is on exploring whether and to what extent the requirements set forth in Article 22 of the General Data Protection Regulation (GDPR) apply within the context of AI-based automatic assessment systems, in particular the legal obligation to ensure that a teacher remains in the loop, for example being capable of overseeing and overriding decisions when necessary.
  • Educational judgments involving automatic assessments frequently occur in a complicated decision-making environment that is framed by institutional processes which are multi-step, hierarchical, and bureaucratic. This complexity makes it challenging to determine whether the output of an AI-based automatic assessment system represents an ‘individual decision’ about a data subject within the meaning of Article 22.
  • It is also unclear whether AI-based assessments involve decisions based ‘solely’ on automatic processing or whether teachers provide decisional support, excluding the application of Article 22. According to recent enforcement decisions, human oversight is entangled with institutional procedures and safeguards as well as system design.
National Category
Law (excluding Law and Society); Computer and Information Sciences
Identifiers
urn:nbn:se:su:diva-225530 (URN), 10.1093/idpl/ipad024 (DOI), 001114797200001 (), 2-s2.0-85188794316 (Scopus ID)
Available from: 2024-01-17. Created: 2024-01-17. Last updated: 2024-11-14. Bibliographically approved.
Colonna, L. (2022). Addressing the Responsibility Gap in Data Protection by Design: Towards a More Future-oriented, Relational, and Distributed Approach. Tilburg Law Review, 27(1), 1-21
2022 (English). In: Tilburg Law Review, ISSN 2211-0046, Vol. 27, no 1, p. 1-21. Article in journal (Refereed). Published.
Abstract [en]

This paper explores the extent to which technology providers are responsible to end users for embedding data protection rules in the AI systems they design and develop, so as to safeguard the fundamental rights to privacy and data protection. The main argument set forth is that a relational rationale, requiring a broader range of actors in the supply chain to share legal responsibility for Data Protection by Design (DPbD) is better suited to address infringements to these fundamental rights than the current model that assigns responsibility mainly to the data controller or data processor. Reconceptualizing the law in a more future-oriented, relational, and distributed way would make it possible to adapt legal rules – including those within the GDPR and the continuously evolving EU acquis – to the complex reality of technology development, at least partly addressing the responsibility gap in DPbD.

A future-oriented conception of responsibility would require technology providers to adopt more proactive approaches to DPbD, even where they are unlikely to qualify as a controller. A relational approach to DPbD would require technology providers to bear greater responsibilities to those individuals or groups that are affected by their design choices. A distributed approach to DPbD would allow for downstream actors in the supply chain to bear part of the legal responsibility for DPbD by relying on legal requirements that are applicable to various actors in the supply chain supporting DPbD such as those found in contract law, liability law, and the emerging EU acquis governing AI, data, and information security.

Keywords
Data Protection by Design, technology providers, GDPR, AI Act, responsibility
National Category
Law (excluding Law and Society)
Identifiers
urn:nbn:se:su:diva-215106 (URN), 10.5334/tilr.274 (DOI), 001000908400001 (), 2-s2.0-85151292444 (Scopus ID)
Available from: 2023-02-28. Created: 2023-02-28. Last updated: 2024-05-24. Bibliographically approved.
Colonna, L. (2022). Artificial Intelligence in Higher Education: Towards a More Relational Approach. Journal of Regulatory Compliance, VIII, 18-54
2022 (English). In: Journal of Regulatory Compliance, ISSN 2831-3062, Vol. VIII, p. 18-54. Article in journal (Refereed). Published.
Abstract [en]

To contribute to the emerging discipline of Responsible Artificial Intelligence (AI), this paper seeks to determine in more detail what responsibility means within the context of the deployment of AI in the Higher Education (HE) context. More specifically, it seeks to disentangle the boundaries of legal responsibilities within a complex system of humans and technology to understand more clearly who is responsible, and for what, under the law when it comes to the use of facial recognition technology (FRT) in this context. The focus of the paper is on examining the critical role and distinct nature of Ed Tech in providing FRT to the HE sector. Applying relational ethics theory, it asks what the legal obligations of Ed Tech product and service developers (private organizations) are in relation to the universities (public and private authorities involved in teaching and research), teachers, students, and other stakeholders who utilize these AI-driven tools.

National Category
Law
Identifiers
urn:nbn:se:su:diva-205078 (URN)
Available from: 2022-05-29. Created: 2022-05-29. Last updated: 2022-05-30. Bibliographically approved.
Colonna, L., Dignum, V., Filatova-Bilous, N., Friberg, S., Haynie-Lavelle, J., Magnusson Sjöberg, C., . . . Tucker, J. (2022). Community Reference Meeting: Challenges and Opportunities of Regulating AI. WASP-HS
2022 (English). Report (Other academic).
Place, publisher, year, edition, pages
WASP-HS, 2022. p. 10
National Category
Law (excluding Law and Society)
Identifiers
urn:nbn:se:su:diva-215108 (URN)
Available from: 2023-02-28. Created: 2023-02-28. Last updated: 2023-03-08. Bibliographically approved.
Colonna, L. (2022). Implementing Data Protection by Design in the Ed Tech Context: What is the Role of Technology Providers? Journal of Law, Technology & the Internet (JOLTI), 13(1), 84-106
2022 (English). In: Journal of Law, Technology & the Internet (JOLTI), ISSN 1949-6451, Vol. 13, no 1, p. 84-106. Article in journal (Refereed). Published.
Abstract [en]

This article explores the specific roles and responsibilities of technology providers when it comes to implementing Data Protection by Design (“DPbD”) and Data Protection by Default (“DPbDf”). As an example, it looks at the Education Technology (“Ed Tech”) sector and the complexities of the supply chains that exist therein to highlight that, in addition to the Higher Education (“HE”) institutions that procure products and services for advancing teaching and learning, Ed Tech vendors may also have responsibility and liability for the processing of students’ personal data. Ultimately, this paper asks whether there are any legal gaps, ambiguities, or normative conflicts to the extent that technology providers can have responsibility in contemporary data processing activities yet escape potential liability where it concerns issues of General Data Protection Regulation (“GDPR”) compliance.

This paper argues that there is befuddlement concerning the determination of which parties are responsible for meeting DPbD and DPbDf obligations, as well as with regard to the extent of this responsibility. In some cases, an Ed Tech provider is, in practice, a controller or processor together with a HE institution; yet, in others, it may not have any legal responsibility to support the development of privacy- and data-protection-preserving systems, notwithstanding the fact that it might know far more about the state of the art of the technology than the HE institution that procured the Ed Tech product or service. Even in cases where it is clear that an Ed Tech provider does have responsibility as a controller or processor, it is unclear how it should share DPbD obligations and coordinate actions with HE institutions, especially when the Ed Tech supplier may only be involved in a limited way or at a minor phase in the processing of student data. There is an urgent need to recognize the complex, interdependent, and nonlinear context of contemporary data processing, where many different controllers, processors, and other actors process personal data in different geographical locations and at different points in time, for both central and peripheral purposes. Likewise, the complexity of the software supply chain must also be emphasized, particularly in contexts such as the supply of educational technology, where technology providers can play a key role in the preservation of privacy and data protection rights but may have only a tangential link to the universities that ultimately use their products and services. There is also a need for a more dynamic approach to considering responsibility regarding DPbD. Instead of thinking about responsibilities in terms of “purpose” and “means”, the law should shift towards a focus on powers and capacities. The law should also clarify whether technology providers must notify controllers about changes to the state of the art and, if so, to what extent.

National Category
Law
Research subject
Law and Information Technology
Identifiers
urn:nbn:se:su:diva-204600 (URN)
Available from: 2022-05-12. Created: 2022-05-12. Last updated: 2022-05-13. Bibliographically approved.
Colonna, L. (2022). Legal Implications of Using AI as an Exam Invigilator. In: Liane Colonna; Stanley Greenstein (Ed.), 2020-2021 Nordic Yearbook: Law in the Era of Artificial Intelligence (pp. 13-46). The Swedish Law and Informatics Research Institute
2022 (English). In: 2020-2021 Nordic Yearbook: Law in the Era of Artificial Intelligence / [ed] Liane Colonna; Stanley Greenstein, The Swedish Law and Informatics Research Institute, 2022, p. 13-46. Chapter in book (Refereed).
Place, publisher, year, edition, pages
The Swedish Law and Informatics Research Institute, 2022
National Category
Law
Research subject
Law and Information Technology
Identifiers
urn:nbn:se:su:diva-204598 (URN), 10.53292/208f5901.adcd9489 (DOI), 978-91-8892-964-8 (ISBN)
Available from: 2022-05-12. Created: 2022-05-12. Last updated: 2023-03-08. Bibliographically approved.