  • 1. Ake-Kob, Alin
    et al.
    Blazeviciene, Aurelija
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Čartolovni, Anto
    Dantas, Carina
    Fedosov, Anton
    Florez-Revuelta, Francisco
    Fosch-Villaronga, Eduard
    He, Zhicheng
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Klimczuk, Andrzej
    Kuźmicz, Maksymilian
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Lukács, Adrienn
    Lutz, Christoph
    Mekovec, Renata
    Miguel, Cristina
    Mordini, Emilio
    Pajalic, Zada
    Pierscionek, Barbara Krystyna
    Santofimia Romero, Maria Jose
    Salah, Albert Ali
    Sobecki, Andrzej
    Solanas, Agusti
    Tamò-Larrieux, Aurelia
    State of the Art on Ethical, Legal, and Social Issues Linked to Audio- and Video-Based AAL Solutions (2021). Report (Other academic)
    Abstract [en]

    Ambient assisted living (AAL) technologies are increasingly presented and sold as essential smart additions to daily life and home environments that will radically transform the healthcare and wellness markets of the future. An ethical approach and a thorough understanding of all ethics in surveillance/monitoring architectures are therefore pressing. AAL poses many ethical challenges, raising questions that will affect immediate acceptance and long-term usage. Furthermore, ethical issues emerge from social inequalities and their potential exacerbation by AAL, accentuating the existing access gap between high-income countries (HIC) and low- and middle-income countries (LMIC). Legal aspects mainly refer to the adherence to existing legal frameworks and cover issues related to product safety, data protection, cybersecurity, intellectual property, and access to data by public, private, and government bodies. Successful privacy-friendly AAL applications are needed, as the pressure to bring Internet of Things (IoT) devices and ones equipped with artificial intelligence (AI) quickly to market cannot overlook the fact that the environments in which AAL will operate are mostly private (e.g., the home). The social issues focus on the impact of AAL technologies before and after their adoption. Future AAL technologies need to consider all aspects of equality, such as gender, race, age and social disadvantages, and avoid increasing loneliness and isolation among, e.g., older and frail people. Finally, the current power asymmetries between the target and general populations should not be underestimated, nor should the discrepant needs and motivations of the target group and those developing and deploying AAL systems. Whilst AAL technologies provide promising solutions for health and social care challenges, they are not exempt from ethical, legal and social issues (ELSI). A set of ELSI guidelines is needed to integrate these factors at the research and development stage.


  • 2. Aleksic, Slavisa
    et al.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Dantas, Carina
    Fedosov, Anton
    Florez-Revuelta, Francisco
    Fosch-Villaronga, Eduard
    Jevremovic, Aleksandar
    Gahbiche Msakniç, Hajer
    Ravi, Siddharth
    Rexha, Blerim
    Tamò-Larrieux, Aurelia
    State of the art in privacy preservation in video data (2022). Report (Other academic)
    Abstract [en]

    Active and Assisted Living (AAL) technologies and services are a possible solution to address the crucial challenges regarding health and social care resulting from demographic changes and current economic conditions. AAL systems aim to improve quality of life and support independent and healthy living of older and frail people. AAL monitoring systems are composed of networks of sensors (worn by the users or embedded in their environment), processing elements, and actuators that analyse the environment and its occupants to extract knowledge and to detect events, such as anomalous behaviours, launch alarms to tele-care centres, or support activities of daily living, among others. Therefore, innovation in AAL can address healthcare and social demands while generating economic opportunities.

    Recently, there have been far-reaching advancements in the development of video-based devices with improved processing capabilities, heightened quality, wireless data transfer, and increased interoperability with Internet of Things (IoT) devices. Computer vision offers the possibility to monitor an environment and report on visual information, which is commonly the most straightforward and human-like way of describing an event, a person, an object, interactions and actions. Therefore, cameras can offer more intelligent solutions for AAL, but they may be considered intrusive by some end users.

    The General Data Protection Regulation (GDPR) establishes the obligation for technologies to meet the principles of data protection by design and by default. More specifically, Article 25 of the GDPR requires that organizations must "implement appropriate technical and organizational measures [...] which are designed to implement data protection principles [...], in an effective manner and to integrate the necessary safeguards into [data] processing." Thus, AAL solutions must consider privacy-by-design methodologies in order to protect the fundamental rights of those being monitored.

    Different methods have been proposed in recent years to preserve visual privacy for identity protection. However, in many AAL applications, where mostly only one person would be present (e.g. an older person living alone), user identification might not be an issue; concerns are more related to the disclosure of appearance (e.g. whether the person is dressed or naked) and behaviour, what we call bodily privacy. Visual obfuscation techniques, such as image filters, facial de-identification, body abstraction, and gait anonymization, can be employed to protect privacy, provided they are agreed upon by the users so that they feel comfortable.

    Moreover, it is difficult to ensure a high level of security and privacy during the transmission of video data. If data is transmitted over several network domains using different transmission technologies and protocols, and finally processed at a remote location and stored on a server in a data center, it becomes demanding to implement and guarantee the highest level of protection over the entire transmission and storage system and for the whole lifetime of the data. The development of video technologies, increase in data rates and processing speeds, wide use of the Internet and cloud computing as well as highly efficient video compression methods have made video encryption even more challenging. Consequently, efficient and robust encryption of multimedia data together with using efficient compression methods are important prerequisites in achieving secure and efficient video transmission and storage.
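    The visual obfuscation techniques named above (image filters and the like) can be illustrated with a minimal sketch. The `pixelate` function, the toy 4x4 grayscale "image", and the block size below are hypothetical illustrative choices, not taken from the report:

    ```python
    def pixelate(image, block=4):
        """Replace each block x block tile of a grayscale image
        (a list of lists of ints) with the tile's average value."""
        h, w = len(image), len(image[0])
        out = [row[:] for row in image]
        for y in range(0, h, block):
            for x in range(0, w, block):
                tile = [image[yy][xx]
                        for yy in range(y, min(y + block, h))
                        for xx in range(x, min(x + block, w))]
                avg = sum(tile) // len(tile)  # mean intensity of the tile
                for yy in range(y, min(y + block, h)):
                    for xx in range(x, min(x + block, w)):
                        out[yy][xx] = avg
        return out

    # A tiny 4x4 "face region"; real systems would apply the filter
    # only to detected face or body areas of each video frame.
    face = [[10, 20, 30, 40],
            [50, 60, 70, 80],
            [90, 100, 110, 120],
            [130, 140, 150, 160]]
    print(pixelate(face, block=2))
    ```

    Coarser blocks trade image utility for stronger obfuscation, which mirrors the trade-off the report describes between privacy protection and the usefulness of the monitored video.
    
    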

  • 3.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    A Taxonomy and Classification of Data Mining (2013). In: SMU Science and Technology Law Review, ISSN 1949-2642, Vol. 16, no 2, p. 309-369. Article in journal (Refereed)
    Abstract [en]

    Data is a source of power, which organizations and individuals of every form are seeking ways to collect, control and capitalize upon. Even though data is not inherently valuable like gold or cattle, many organizations and individuals understand, almost instinctively, that there are great possibilities in the vast amounts of data available to modern society. Data mining is an important way to employ data by dynamically processing it through the use of advancing technology. The common usage of the term "data mining" is problematic because the term is used so variably that it is beginning to lose meaning. The problem is partially due to the breadth and complexity of activities referred to as "data mining." This overuse, especially from the perspective of those lacking a scientific background, creates a befuddlement and alienation of the topic. As such, individuals seem to haphazardly refer to data mining without a genuine understanding of what this technology entails. This paper seeks to demystify data mining for lawyers through a clarification of some of its intricacies and nuances. The goal is to explain how data mining works from a technological perspective in order to lay a foundation for understanding whether data mining is sufficiently addressed by the law. A central ambition is to look beyond the buzzword and to take a realistic view of the core attributes of data mining. In an effort to understand if there is a need for new legal models and solutions, particular attention will be paid to exploring whether data mining is a genuinely new concept or whether it is a case of "the emperor's new clothes."
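    One canonical data-mining task of the kind this abstract seeks to demystify, frequent-pattern discovery over transaction data, can be sketched in a few lines. The function name, the toy baskets, and the support threshold are illustrative assumptions, not taken from the article:

    ```python
    from collections import Counter
    from itertools import combinations

    def frequent_pairs(transactions, min_support):
        """Count co-occurring item pairs and keep those meeting
        min_support: the core counting step behind association-rule
        mining, one classic family of data-mining techniques."""
        counts = Counter()
        for items in transactions:
            # sort so (a, b) and (b, a) count as the same pair
            for pair in combinations(sorted(set(items)), 2):
                counts[pair] += 1
        return {pair: n for pair, n in counts.items() if n >= min_support}

    baskets = [
        ["bread", "milk", "eggs"],
        ["bread", "milk"],
        ["milk", "eggs"],
        ["bread", "butter"],
    ]
    print(frequent_pairs(baskets, min_support=2))
    ```

    Even this toy example shows why the technology strains the law's categories: the interesting output (which items co-occur) was never an explicit field in the input data but is derived from it.
    
    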

  • 4.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Addressing the Responsibility Gap in Data Protection by Design: Towards a More Future-oriented, Relational, and Distributed Approach (2022). In: Tilburg Law Review, ISSN 2211-0046, Vol. 27, no 1, p. 1-21. Article in journal (Refereed)
    Abstract [en]

    This paper explores the extent to which technology providers are responsible to end users for embedding data protection rules in the AI systems they design and develop, so as to safeguard the fundamental rights to privacy and data protection. The main argument set forth is that a relational rationale, requiring a broader range of actors in the supply chain to share legal responsibility for Data Protection by Design (DPbD) is better suited to address infringements to these fundamental rights than the current model that assigns responsibility mainly to the data controller or data processor. Reconceptualizing the law in a more future-oriented, relational, and distributed way would make it possible to adapt legal rules – including those within the GDPR and the continuously evolving EU acquis – to the complex reality of technology development, at least partly addressing the responsibility gap in DPbD.

    A future-oriented conception of responsibility would require technology providers to adopt more proactive approaches to DPbD, even where they are unlikely to qualify as a controller. A relational approach to DPbD would require technology providers to bear greater responsibilities to those individuals or groups that are affected by their design choices. A distributed approach to DPbD would allow for downstream actors in the supply chain to bear part of the legal responsibility for DPbD by relying on legal requirements that are applicable to various actors in the supply chain supporting DPbD such as those found in contract law, liability law, and the emerging EU acquis governing AI, data, and information security.

  • 5.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Article 4 of the EU Data Protection Directive and the irrelevance of the EU–US Safe Harbor Program? (2014). In: International Data Privacy Law, ISSN 2044-3994, E-ISSN 2044-4001, Vol. 4, no 3, p. 203-221. Article in journal (Refereed)
    Abstract [en]
    • The relationship between the EU–US Safe Harbor Program and the applicable law provisions set forth in the EU Data Protection Directive and the proposed EU Data Protection Regulation requires clarification.

    • A central concern for US companies is that the benefits of enrolment in the EU–US Safe Harbor Program will be undermined by the broad assertions of extraterritorial jurisdiction made by the EU pursuant to Article 4 of the Directive/Article 3 of the proposed Regulation.

    • If the extraterritorial scope of the Directive/Regulation is widely interpreted then many US companies may lose their incentive to join the Safe Harbor Program because the major benefits of joining the Safe Harbor Program—the ability to rely on industry dispute resolution mechanisms, US law to interpret the Principles, and US courts and administrative bodies to hear claims—will be removed.

  • 6.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Artificial Intelligence in Higher Education: Towards a More Relational Approach (2022). In: Journal of Regulatory Compliance, ISSN 2831-3062, Vol. VIII, p. 18-54. Article in journal (Refereed)
    Abstract [en]

    To contribute to the emerging discipline of Responsible Artificial Intelligence (AI), this paper seeks to determine in more detail what responsibility means within the context of the deployment of AI in the Higher Education (HE) context. More specifically, it seeks to disentangle the boundaries of legal responsibilities within a complex system of humans and technology to understand more clearly who is responsible, and for what, under the law when it comes to the use of facial recognition technology (FRT) in this context. The focus of the paper is on examining the critical role and distinct nature of Ed Tech in providing FRT to the HE sector. Applying relational ethics theory, it asks what the legal obligations of Ed Tech product and service developers (private organizations) are in relation to the universities (public and private authorities involved in teaching and research), teachers, students and other stakeholders who utilize these AI-driven tools.

  • 7.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Artificial Intelligence in the Internet of Health Things: Is the Solution to AI Privacy More AI? (2021). In: Boston University Journal of Science and Technology Law, ISSN 1548-520X, Vol. 27, no 2, p. 312-343. Article in journal (Refereed)
  • 8.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Data Mining and Its Paradoxical Relationship to the Purpose Limitation Principle (2014). In: Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges / [ed] Serge Gutwirth, Ronald Leenes, Paul De Hert, Dordrecht: Springer, 2014, p. 299-321. Chapter in book (Refereed)
    Abstract [en]

    European Union data protection law aims to protect individuals from privacy intrusions through a myriad of procedural tools that reflect a code of fair information principles. These tools empower the individual by giving him/her rights to control the processing of his/her data. One of the problems, however, with the European Union’s current reliance on fair information principles is that these tools are increasingly challenged by the technological reality. And perhaps nowhere is this more evident than when it comes to data mining, which puts the European Union data protection principles to the ultimate test. As early as 1998, commentators noted that there is quite a paradoxical relation between data mining and some data protection principles. This paper seeks to explore this so-called paradoxical relationship further and to specifically examine how data mining calls into question the purpose limitation principle. Particular attention will be paid to how data mining defies this principle in a way that data analysis tools of the recent past do not.

  • 9.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Data Mining and the Need for Semantic Management (2013). In: Internationalisation of Law in the Digital Information Society / [ed] Dan Jerker Svantesson, Stanley Greenstein, København: Ex Tuto Publishing, 2013. Chapter in book (Other academic)
  • 10.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Europe Versus Facebook: An Imbroglio of EU Data Protection Issues (2016). In: Data Protection on the Move: Current Developments in ICT and Privacy/Data Protection / [ed] Serge Gutwirth, Ronald Leenes, Paul De Hert, Dordrecht: Springer, 2016, p. 25-50. Chapter in book (Refereed)
    Abstract [en]

    In this paper, the case Europe versus Facebook is presented as a microcosm of the modern data protection challenges that arise from globalization, technological progress and seamless cross-border flows of personal data. It aims to shed light on a number of sensitive issues closely related to the case, which namely surround how to delimit the power of a European Data Protection Authority to prevent a specific data flow to the US from the authority of the European Commission to find the entire EU-US Safe Harbor Agreement invalid. This comment will also consider whether the entire matter might have been more clear-cut if Europe-versus-Facebook had asserted its claims against Facebook US directly pursuant to Article 4 of the EU Data Protection Directive, rather than through Facebook Ireland indirectly under the Safe Harbor Agreement.

  • 11.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Exploring the Relationship Between Article 22 of the General Data Protection Regulation and Article 14 of the Proposed AI Act: Some Preliminary Observations and Critical Reflections (2023). In: Dataskyddet 50 År – Historia, Aktuella problem och Framtid / [ed] Martin Brinnen; Cecilia Magnusson Sjöberg; David Törngren; Daniel Westman; Sören Öman, Visby: Eddy.se, 2023, p. 443-465. Chapter in book (Other academic)
  • 12.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Implementing Data Protection by Design in the Ed Tech Context: What is the Role of Technology Providers? (2022). In: Journal of Law, Technology & the Internet (JOLTI), ISSN 1949-6451, Vol. 13, no 1, p. 84-106. Article in journal (Refereed)
    Abstract [en]

    This article explores the specific roles and responsibilities of technology providers when it comes to implementing Data Protection by Design (“DPbD”) and Data Protection by Default (“DPbDf”). As an example, it looks at the Education Technology (“Ed Tech”) sector and the complexities of the supply chains that exist therein to highlight that, in addition to the Higher Education (“HE”) institutions that procure products and services for advancing teaching and learning, Ed Tech vendors may also have responsibility and liability for the processing of students’ personal data. Ultimately, this paper asks whether there are any legal gaps, ambiguities, or normative conflicts to the extent that technology providers can have responsibility in contemporary data processing activities yet escape potential liability where it concerns issues of General Data Protection Regulation (“GDPR”) compliance.

    This paper argues that there is befuddlement concerning the determination of which parties are responsible for meeting DPbD and DPbDf obligations, as well as with regard to the extent of this responsibility. In some cases, an Ed Tech provider is a controller or processor in practice together with a HE institution, yet in others it may not have any legal responsibility to support the development of privacy- and data-protection-preserving systems, notwithstanding the fact that it might be much more knowledgeable than the HE institution that has procured the Ed Tech product or service about the state of the art of the technology. Even in cases where it is clear that an Ed Tech provider does have responsibility as a controller or processor, it is unclear how it should share DPbD obligations and coordinate actions with HE institutions, especially when the Ed Tech supplier may only be involved in a limited way or at a minor phase in the processing of student data.

    There is an urgent need to recognize the complex, interdependent, and nonlinear context of contemporary data processing, where there exist many different controllers, processors, and other actors processing personal data in different geographical locations and at different points in time for both central and peripheral purposes. Likewise, the complexity of the supply of software must also be emphasized, particularly in contexts such as the supply of educational technology, where technology providers can play a key role in the preservation of privacy and data protection rights but may only have a tangential link to the universities that ultimately use their products and services. There is also a need for a more dynamic approach to considering responsibility regarding DPbD. Instead of thinking about responsibilities in terms of “purpose” and “means”, the law should shift towards a focus on powers and capacities. The law should also clarify whether technology providers must notify controllers about changes to the state of the art and, if so, to what extent.

  • 13.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    In Search of Data Protection’s Holy Grail: Applying Privacy by Design to Lifelogging Technologies (2020). In: Data Protection and Privacy: Data Protection and Democracy / [ed] Dara Hallinan, Ronald Leenes, Serge Gutwirth, Paul De Hert, Hart Publishing Ltd, 2020. Chapter in book (Refereed)
  • 14.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Legal and regulatory challenges to utilizing lifelogging technologies for the frail and sick (2019). In: International Journal of Law and Information Technology, ISSN 0967-0769, E-ISSN 1464-3693, Vol. 27, no 1, p. 50-74. Article in journal (Refereed)
    Abstract [en]

    Lifelogging technologies have the capacity to transform the health and social care landscape in a way that few could have imagined. Indeed, the emergence of lifelogging technologies within the context of healthcare presents incredible opportunities to diagnose illnesses, engage in preventative medicine, manage healthcare costs and allow the elderly to live on their own for longer periods. These technologies, however, require coherent legal regulation in order to ensure, among other things, the safety of the device and privacy of the individual. When producing lifelogging technologies, it is important that developers understand the legal framework in order to create a legally compliant device. The current regulation of lifelogging is highly fragmented, consisting of a complex patchwork of laws. There are also a number of different regulatory agencies involved. Laws and regulations vary, depending on jurisdiction, making development of these technologies more challenging, particularly given the fact that many lifelogging tools have an international dimension.

  • 15.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Legal Implications of Data Mining (2019). In: Secure Digitalisation / [ed] Cyril Holm, Stockholm: Institutet för rättsinformatik, Stockholms universitet, 2019. Chapter in book (Other academic)
  • 16.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law. Institutet för rättsinformatik (IRI), The Swedish Law and Informatics Research Institute.
    Legal Implications of Data Mining: Assessing the European Union’s Data Protection Principles in Light of the United States Government’s National Intelligence Data Mining Practices (2016). Doctoral thesis, monograph (Other academic)
    Abstract [en]

    This dissertation addresses some of the data protection challenges that have arisen from globalization, technological progress, terrorism and seamless cross-border flows of personal data. The focus of the thesis is to examine ways to protect the personal data of EU citizens, which may be collected by communications service providers such as Google and Facebook, transferred to the US Government and data mined within the context of American national intelligence surveillance programs. The work explores the technology of data mining and examines whether there are sufficient guarantees under US law for the rights of non-US persons when it comes to applying this technology in the national-security context.

  • 17.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Legal Implications of Using AI as an Exam Invigilator (2021). Other (Other academic)
    Abstract [en]

    Universities around the globe have been profoundly affected by stay-at-home orders, which have required them to close their doors and shift to online teaching and learning. In an effort to avoid delaying or postponing examinations amid the Covid-19 outbreak, many higher-education institutions have turned to online proctoring tools, raising complex questions about how they can ensure the integrity of online assessments while at the same time respecting ethical and legal constraints, especially regarding students’ fundamental rights to privacy, data protection and non-discrimination. In particular, universities are increasingly relying on AI-based facial recognition technologies (FRT) that can be used to authenticate remote users who connect from off campus as well as to identify cheating and other dubious behavior throughout the online exam process.

  • 18.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Legal Implications of Using AI as an Exam Invigilator (2022). In: 2020-2021 Nordic Yearbook: Law in the Era of Artificial Intelligence / [ed] Liane Colonna; Stanley Greenstein, The Swedish Law and Informatics Research Institute, 2022, p. 13-46. Chapter in book (Refereed)
  • 19.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Mo’ Data, Mo’ Problems? Personal Data Mining and the Challenge to the Data Minimization Principle (2013). Conference paper (Other academic)
  • 20.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Opportunities and Challenges to Utilizing Text-data Mining in Public Libraries: A Need for Legal Research (2018). In: Scandinavian Studies in Law, ISSN 0085-5944, Vol. 65, p. 191-196. Article in journal (Other academic)
  • 21.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Preserving Privacy in Times of Counter Cyber-Terrorism Data Mining (2013). In: Big Data: Challenges and Opportunities: Proceedings of The 9th International Conference on Internet, Law & Politics, 2013. Conference paper (Refereed)
  • 22.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Prism and The European Union’s Data Protection Directive (2013). In: The John Marshall Journal of Computer & Information Law, ISSN 1078-4128, Vol. 30, no 2, p. 227-252. Article in journal (Refereed)
  • 23.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Privacy, Risk, Anonymization and Data Sharing in the Internet of Health Things (2020). In: Pittsburgh Journal of Technology Law & Policy, E-ISSN 2164-800X, Vol. 20, no 1, p. 148-175. Article in journal (Refereed)
    Abstract [en]

    This paper explores a specific risk-mitigation strategy to reduce privacy concerns in the Internet of Health Things (IoHT): data anonymization. It contributes to the current academic debate surrounding the role of anonymization in the IoHT by evaluating how data controllers can balance privacy risks against the quality of output data and select the appropriate privacy model that achieves the aims underlying the concept of Privacy by Design. It sets forth several approaches for identifying the risk of re-identification in the IoHT as well as explores the potential for synthetic data generation to be used as an alternative method to anonymization for data sharing.
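    One privacy model commonly weighed in such re-identification risk assessments is k-anonymity; a minimal sketch of measuring it over a set of generalized records (the attribute names and data below are hypothetical, not from the article) might be:

    ```python
    from collections import Counter

    def k_anonymity(records, quasi_identifiers):
        """Return the size of the smallest equivalence class formed by
        the quasi-identifier attributes: every record is then
        indistinguishable from at least k-1 others on those attributes."""
        classes = Counter(
            tuple(rec[qi] for qi in quasi_identifiers) for rec in records
        )
        return min(classes.values())

    # Records already generalized (zip codes truncated, ages bucketed),
    # as an anonymization step in the IoHT might produce.
    records = [
        {"zip": "113**", "age": "30-39", "condition": "flu"},
        {"zip": "113**", "age": "30-39", "condition": "cold"},
        {"zip": "114**", "age": "40-49", "condition": "flu"},
        {"zip": "114**", "age": "40-49", "condition": "asthma"},
    ]
    print(k_anonymity(records, ["zip", "age"]))
    ```

    A controller could compare the resulting k against a chosen threshold when balancing, as the abstract puts it, privacy risks against the quality of the output data.
    
    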

  • 24.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Reconciling Privacy by Design with the Principle of Transparency (2020). In: General Principles of EU Law and the EU Digital Order / [ed] Ulf Bernitz, Xavier Groussot, Jaan Paju, Sybe A. de Vries, Alphen aan den Rijn: Wolters Kluwer, 2020. Chapter in book (Other academic)
  • 25.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Reflections on the Use of AI in the Legal Domain (2021). In: Law and Business, ISSN 2720-1279, Vol. 1, no 1, p. 1-10. Article in journal (Other academic)
    Abstract [en]

    This paper examines the field of Artificial Intelligence (AI) and Law and offers some broad reflections on its current state. First, the paper introduces the concept of AI, paying particular attention to the distinction between hard and soft AI. Next, it considers how AI can be used to support (or replace!) legal work and legal reasoning. The paper goes on to explore applications of AI in the legal domain and concludes with some critical reflections on the use of AI in the legal context.

  • 26.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Schrems vs. Commissioner: A Precedent for the CJEU to Intervene in the National Intelligence Surveillance Activities of Member States? (2016). In: Europarättslig tidskrift, ISSN 1403-8722, E-ISSN 2002-3561, no 2, p. 208-224. Article in journal (Other academic)
  • 27.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Teachers in the loop? An analysis of automatic assessment systems under Article 22 GDPR (2023). In: International Data Privacy Law, ISSN 2044-3994, E-ISSN 2044-4001, Vol. 14, no 1, p. 3-18. Article in journal (Refereed)
    Abstract [en]

    Key Points

    • This article argues that while there is great promise in the everyday automation of higher education to create benefits for students, efficiencies for instructors, and cost savings for institutions, it is important to critically consider how AI-based assessment will transform the role of teachers and the relationship between teachers and students.
    • The focus of the work is on exploring whether and to what extent the requirements set forth in Article 22 of the General Data Protection Regulation (GDPR) apply within the context of AI-based automatic assessment systems, in particular the legal obligation to ensure that a teacher remains in the loop, for example being capable of overseeing and overriding decisions when necessary.
    • Educational judgments involving automatic assessments frequently occur in a complicated decision-making environment that is framed by institutional processes which are multi-step, hierarchical, and bureaucratic. This complexity makes it challenging to determine whether the output of an AI-based automatic assessment system represents an ‘individual decision’ about a data subject within the meaning of Article 22.
    • It is also unclear whether AI-based assessments involve decisions based ‘solely’ on automatic processing or whether teachers provide decisional support, excluding the application of Article 22. According to recent enforcement decisions, human oversight is entangled with institutional procedures and safeguards as well as system design.
  • 28.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    The AI Act’s Research Exemption: A Mechanism for Regulatory Arbitrage? (2023). In: The Yearbook of Socio-Economic Constitutions: Law and the Governance of Artificial Intelligence / [ed] Andreas Moberg; Eduardo Gill-Pedro, Cham: Springer, 2023. Chapter in book (Refereed)
  • 29.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    The AI Regulation and Higher Education: Preliminary Observations and Critical Perspectives (2022). In: Law, AI and Digitalisation / [ed] Katja de Vries; Mattias Dahlberg, Uppsala: Iustus förlag, 2022, p. 333-356. Chapter in book (Refereed)
  • 30.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    The New EU Proposal To Regulate Data Protection in the Law Enforcement Sector: Raises the Bar But Not High Enough (2012). Report (Other academic)
  • 31.
    Colonna, Liane
    et al.
    Stockholm University, Faculty of Law, Department of Law.
    Dignum, Virginia
    Filatova-Bilous, Natalia
    Friberg, Sandra
    Haynie-Lavelle, Jess
    Magnusson Sjöberg, Cecilia
    Stockholm University, Faculty of Law, Department of Law.
    Muller, Catelijne
    Munetsi, Dennis
    Razmetaeva, Yulia
    Strange, Michael
    Tucker, Jason
    Community Reference Meeting: Challenges and Opportunities of Regulating AI (2022). Report (Other academic)
  • 32.
    Colonna, Liane
    et al.
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Greenstein, Stanley
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Nordic Yearbook of Law and Informatics 2020–2021: Law in the Era of Artificial Intelligence (2022). Book (Refereed)
  • 33. Flórez-Revuelta, Francisco
    et al.
    Mihailidis, Alex
    Ziefle, Martina
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Spinsante, Susanna
    Privacy-Aware and Acceptable Lifelogging Services for Older and Frail People: the PAAL Project (2018). In: Proceedings 2018 IEEE 8th International Conference on Consumer Electronics - Berlin (ICCE-Berlin) / [ed] Reinhard Moeller, Lucio Ciabattoni, IEEE, 2018. Conference paper (Refereed)
    Abstract [en]

    Developed countries around the world are facing crucial challenges in health and social care because of demographic change and the current economic context. Innovation in technologies and services for Active and Assisted Living stands out as a promising way to address these challenges while profiting from the economic opportunities they create. For instance, lifelogging technologies may enable and motivate individuals to pervasively capture data about themselves, their environment, and the people with whom they interact, in order to receive a variety of services that increase their health, well-being, and independence. In this context, the PAAL project presented in this paper has been conceived with a manifold aim: to increase awareness of the ethical, legal, social, and privacy issues associated with lifelogging technologies; to propose privacy-aware lifelogging services for older people, evaluating their acceptability issues and barriers related to familiarity with technology; and to develop specific applications addressing relevant use cases for older and frail people.

  • 34. Mihailidis, Alex
    et al.
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    A Methodological Approach to Privacy by Design within the Context of Lifelogging Technologies (2020). In: Rutgers Computer & Technology Law Journal, ISSN 0735-8938, Vol. 46, no 1, p. 1-52. Article in journal (Refereed)
    Abstract [en]

    Lifelogging technologies promise to manage many of the concerns raised by population aging. The technology can be used to predict and prevent disease, to provide personalized healthcare, and to support formal and informal caregivers. Although lifelogging technologies offer major opportunities to improve efficiency and care in the healthcare setting, many aspects of these devices raise serious privacy concerns that can undercut their use and further development. One way to manage the privacy concerns raised by lifelogging technologies is through the application of Privacy by Design, an approach that involves embedding legal rules into information systems at the outset of their development. Many current approaches to Privacy by Design, however, lack methodological rigor, leaving stakeholders perplexed about how to achieve the objectives underlying the concept in practice.

    This paper explores ways to develop a Privacy by Design methodology within the context of Ambient Assisted Living (AAL) technologies like lifelogging. It sets forth a concrete, methodological approach to incorporating privacy into all stages of a lifelogging system's development. The methodology begins with a contextual understanding of privacy, relying on theoretical and empirical studies conducted by experts in human-computer relations. It then involves an analysis of the relevant black-letter law. A systematic approach to incorporating the requisite legal rules into lifelogging devices is then presented, taking into account the specific design elements of these kinds of systems.

  • 35. Wilkowska, Wiktoria
    et al.
    Offermann, Julia
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law, The Swedish Law and Informatics Research Institute.
    Florez-Revuelta, Francisco
    Climent-Pérez, Pau
    Mihailidis, Alex
    Poli, Angelica
    Spinsante, Susanna
    Ziefle, Martina
    Interdisciplinary perspectives on privacy awareness in lifelogging technology development (2023). In: Journal of Ambient Intelligence and Humanized Computing, ISSN 1868-5137, E-ISSN 1868-5145, Vol. 14, no 3, p. 2291-2312. Article in journal (Refereed)
    Abstract [en]

    Population aging resulting from demographic change requires challenging decisions and necessary steps by different stakeholders to manage the current and future demand for assistance and support. The consequences of population aging can be mitigated to some extent by assistive technologies that support the autonomous living of older individuals and persons in need of care in their private environments for as long as possible. A variety of technical solutions are already available on the market, but privacy protection is a serious, often neglected, issue when using such (assistive) technology. Thus, privacy needs to be thoroughly taken into consideration in this context. In the three-year project PAAL (‘Privacy-Aware and Acceptable Lifelogging Services for Older and Frail People’), researchers from different disciplines, such as law, rehabilitation, human-computer interaction, and computer science, investigated the phenomenon of privacy when using assistive lifelogging technologies. In concrete terms, the concept of Privacy by Design was realized in two exemplary lifelogging applications in private and professional environments. A user-centered empirical approach was applied to the lifelogging technologies, investigating the perceptions and attitudes of (older) users with different health-related and biographical profiles. The knowledge gained through this interdisciplinary collaboration can improve the implementation and optimization of assistive applications. In this paper, partners of the PAAL project present insights gained from their cross-national, interdisciplinary work on privacy-aware and acceptable lifelogging technologies.

  • 36. Wilkowska, Wiktoria
    et al.
    Offermann-van Heek, Julia
    Colonna, Liane
    Stockholm University, Faculty of Law, Department of Law.
    Ziefle, Martina
    Two Faces of Privacy: Legal and Human-Centered Perspectives of Lifelogging Applications in Home Environments (2020). In: Human Aspects of IT for the Aged Population. Healthy and Active Aging: Proceedings, Part II / [ed] Qin Gao, Jia Zhou, Springer, 2020, p. 545-564. Conference paper (Refereed)
    Abstract [en]

    In view of the consequences of demographic change, using assistive lifelogging technologies in domestic environments represents one potential approach to supporting elderly people and people in need of care in staying longer within their own homes. Yet the handling of personal data poses a considerable challenge to perceptions of privacy and data security, and therefore to the accepted use of such technologies. The present study focuses on aspects of data management in the context of two different lifelogging applications, considering both a legal and a human-centered perspective. In a two-step empirical process, consisting of qualitative interviews and an online survey, these aspects were explored and evaluated by a representative German sample of adult participants (N = 209). Findings show positive attitudes towards using lifelogging, but also high requirements regarding privacy, data security, and anonymization of the data. In addition, the study provides deep insights into the preferred duration and location of data storage, and into permissions for third parties to access personal information. Knowledge of preferences and requirements in the area of data management from the legal and human-centered perspectives is crucial for lifelogging and must be considered in applications that support people in their daily living at home. The outcomes of the present study contribute considerably to the understanding of an optimal infrastructure for accepted and willingly used lifelogging applications.
