Algorithmic Fairness and Secure Information Flow (Extended Abstract)

Reviewed Paper in Proceedings

Author(s): Bernhard Beckert, Michael Kirsten, and Michael Schefczyk
In: European Workshop on Algorithmic Fairness (EWAF '22), Lightning round track
Year: 2022
URL: https://sites.google.com/view/ewaf22/accepted-papers

Abstract

The concept of enforcing secure information flow is well studied in computer science in the context of information security: If secret information may “flow” through an algorithm or program in such a way that it can influence the program’s public output, this is considered insecure information flow, as attackers could potentially observe (parts of) the secret. There is a wide spectrum of methods and tools to analyse whether a given program satisfies a given definition of secure information flow.
We argue that there is a strong correspondence between secure information flow and algorithmic fairness: if protected attributes such as race, gender, or age are treated as secret program inputs, then secure information flow means that these “secret” attributes cannot influence the result of a program.
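
The correspondence can be illustrated with a small sketch (not taken from the paper; the function names and input values below are hypothetical): if the protected attribute is modelled as the "secret" input, the classical noninterference condition requires that, for any fixed public input, varying the secret input never changes the observable output. A brute-force check of this condition then doubles as a simple fairness test.

# Minimal illustrative sketch, assuming a decision program with one public and
# one protected ("secret") input. Not the authors' method or tooling.

def biased_decision(income, age):
    # Insecure flow / unfair in this sense: the protected attribute "age"
    # influences the observable output.
    return income > 50_000 and age < 60

def blind_decision(income, age):
    # Secure flow: the protected attribute cannot influence the output.
    return income > 50_000

def noninterference_holds(decision, public_inputs, secret_values):
    """Return True if varying the secret (protected) input never changes the
    output for any fixed public input -- the noninterference condition."""
    for public in public_inputs:
        outputs = {decision(public, secret) for secret in secret_values}
        if len(outputs) > 1:
            return False  # some secret value changed the observable result
    return True

if __name__ == "__main__":
    incomes = [20_000, 50_001, 80_000]   # public input (hypothetical values)
    ages = [25, 45, 65]                  # protected attribute treated as secret
    print(noninterference_holds(biased_decision, incomes, ages))  # False
    print(noninterference_holds(blind_decision, incomes, ages))   # True

Such an exhaustive check only works for small, finite input domains; the paper points instead to the existing spectrum of information-flow analysis methods and tools for verifying this property on real programs.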

BibTeX

@InProceedings{BeckertKirstenSchefczyk2022,
  author                = {Bernhard Beckert and
                           Michael Kirsten and
                           Michael Schefczyk},
  title                 = {Algorithmic Fairness and Secure Information
                           Flow (Extended Abstract)},
  booktitle             = {European Workshop on Algorithmic Fairness ({EWAF} '22),
                           Lightning round track},
  editor                = {Christoph Heitz and
                           Corinna Hertweck and
                           Eleonora Vigan{\`{o}} and
                           Michele Loi},
  month                 = jun,
  year                  = {2022},
  abstract              = {The concept of enforcing secure information flow is well studied in computer science
                           in the context of information security: If secret information may ``flow'' through an
                           algorithm or program in such a way that it can influence the program's public output,
                           this is considered insecure information flow, as attackers could potentially observe
                           (parts of) the secret. There is a wide spectrum of methods and tools to analyse
                           whether a given program satisfies a given definition of secure information flow.
                           \newline
                           We argue that there is a strong correspondence between secure information flow and
                           algorithmic fairness: if protected attributes such as race, gender, or age are
                           treated as secret program inputs, then secure information flow means that these
                           ``secret'' attributes cannot influence the result of a program.},
  url                   = {https://sites.google.com/view/ewaf22/accepted-papers},
  venue                 = {Z{\"{u}}rich, Switzerland},
  eventdate             = {2022-06-08/2022-06-09}
}