Aleksandra Korolova

From Wikipedia, the free encyclopedia
Alma mater: Stanford University (PhD)
Institutions: Princeton University
Thesis: Protecting Privacy When Mining and Sharing User Data (2012)
Doctoral advisor: Ashish Goel[1]
Website: www.korolova.com

Aleksandra Korolova is a Latvian-American computer scientist and an assistant professor at Princeton University. Her research develops privacy-preserving and fair algorithms, studies the individual and societal impacts of machine learning and AI, and audits AI systems for algorithmic bias.

Education

Korolova earned her undergraduate degree from the Massachusetts Institute of Technology and completed her doctoral degree at Stanford University.[2]

Research and career

Privacy

Korolova's early research examined ways to make internet searches more anonymous.[3] She was among the first to identify privacy vulnerabilities in targeted advertising systems.[4][5]

Korolova's work led to the first industry deployment of differential privacy, Google's RAPPOR,[6][7] demonstrating the feasibility of the local model and spurring significant academic interest in developing algorithms for this model of privacy.
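RAPPOR builds on the classical randomized-response technique: each user perturbs their own data before it leaves the device, so the aggregator never sees raw values but can still estimate population statistics. A minimal Python sketch of this idea (an illustration of the local model, not Google's actual implementation) might look like:

```python
import random

def randomized_response(true_bit: int, p: float = 0.75) -> int:
    """Report the true bit with probability p, otherwise its flip.

    The perturbation happens on the user's side -- the "local model"
    of differential privacy: the server only ever sees noisy reports.
    """
    return true_bit if random.random() < p else 1 - true_bit

def estimate_fraction(reports: list[int], p: float = 0.75) -> float:
    """Unbiased estimate of the true fraction of 1s from noisy reports."""
    observed = sum(reports) / len(reports)
    # E[observed] = p*f + (1 - p)*(1 - f), so solve for f:
    return (observed - (1 - p)) / (2 * p - 1)

# Simulate 100,000 users, 30% of whom truly have the bit set.
random.seed(0)
truth = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomized_response(b) for b in truth]
print(round(estimate_fraction(reports), 2))  # close to 0.30
```

No individual report reveals the user's true bit with certainty, yet the aggregate estimate converges on the true fraction as the number of users grows; RAPPOR extends this basic mechanism to strings and repeated collection.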

RAPPOR was runner-up for the PET Award for outstanding research in privacy-enhancing technologies in 2015[8] and received the Association for Computing Machinery CCS Test-of-Time Award in 2025.[9]

Algorithmic fairness

Korolova developed new black-box audit methodologies that isolate the role of ad delivery algorithms from other confounding factors.[10] Using these methods, she demonstrated that Facebook users cannot opt out of ads based on their location.[11] Her audits showed that Facebook's ad delivery algorithms lead to discriminatory outcomes in housing and employment advertising,[12][13] while LinkedIn's did not show a similar bias.[14][15] Her research also documented a filter bubble in political ad delivery.[16] These findings contributed to a 2022 settlement[17] between the U.S. Department of Justice and Meta that required Meta to modify its ad delivery system.

In 2024, Korolova reported on an interaction in which Meta's AI chatbot claimed to have a child with special needs.[18]

Recognition

Korolova received a National Science Foundation CAREER award,[22] was named a 2024 Sloan Research Fellow,[23] and was honored by President Biden among recipients of the Presidential Early Career Award for Scientists and Engineers.[24]

References

  1. ^ Aleksandra Korolova at the Mathematics Genealogy Project
  2. ^ "Faculty Directory". Princeton Department of Computer Science.
  3. ^ Marks, Paul (March 2009). "Noise could keep web searchers' IDs private". New Scientist. Vol. 201, no. 2698. p. 20.
  4. ^ Korolova, Aleksandra (2011). "Privacy Violations Using Microtargeted Ads: A Case Study". Journal of Privacy and Confidentiality. 3. doi:10.29012/jpc.v3i1.594.
  5. ^ Helft, Miguel (October 22, 2010). "Marketers Can Glean Private Data on Facebook". The New York Times.
  6. ^ Erlingsson, Úlfar; Pihur, Vasyl; Korolova, Aleksandra (2014). "RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response". Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security. pp. 1054–1067. arXiv:1407.6981. doi:10.1145/2660267.2660348. ISBN 978-1-4503-2957-6.
  7. ^ Erlingsson, Úlfar. "Learning statistics with privacy, aided by the flip of a coin". Google Security Blog.
  8. ^ "PET Award".
  9. ^ ACM Special Interest Group on Security, Audit and Control (SIGSAC). "ACM CCS Test-of-Time Award".
  10. ^ Hao, Karen (April 9, 2021). "Facebook's ad algorithms are still excluding women from seeing jobs". MIT Technology Review.
  11. ^ Hern, Alex (December 19, 2018). "Facebook users cannot avoid location-based ads, investigation finds". The Guardian.
  12. ^ Ali, Muhammad; Sapiezynski, Piotr; Bogen, Miranda; Korolova, Aleksandra; Mislove, Alan; Rieke, Aaron (2019). "Discrimination through Optimization: How Facebook's Ad Delivery Can Lead to Biased Outcomes". Proceedings of the ACM on Human-Computer Interaction. 3: 1–30. arXiv:1904.02095. doi:10.1145/3359301.
  13. ^ Imana, Basileal; Korolova, Aleksandra; Heidemann, John (2021). "Auditing for Discrimination in Algorithms Delivering Job Ads". Proceedings of the Web Conference 2021. pp. 3767–3778. arXiv:2104.04502. doi:10.1145/3442381.3450077. ISBN 978-1-4503-8312-7.
  14. ^ O'Brien, Matt; Ortutay, Barbara (April 10, 2021). "Facebook's ads biased". Daily News. p. 19. Retrieved January 22, 2025.
  15. ^ Horwitz, Jeff (April 9, 2021). "Facebook Algorithm Shows Gender Bias in Job Ads, Study Finds; Researchers found the platform's algorithms promoted roles to certain users; company pledges to continue work in removing bias from recommendations". The Wall Street Journal.
  16. ^ Ali, Muhammad; Sapiezynski, Piotr; Korolova, Aleksandra; Mislove, Alan; Rieke, Aaron (2021). "Ad Delivery Algorithms: The Hidden Arbiters of Political Messaging". Proceedings of the 14th ACM International Conference on Web Search and Data Mining. pp. 13–21. doi:10.1145/3437963.3441801. ISBN 978-1-4503-8297-7.
  17. ^ "United States v. Meta Platforms, Inc., f/k/a Facebook, Inc. (S.D.N.Y.)". Civil Rights Division, U.S. Department of Justice. 21 June 2022.
  18. ^ Al-Sibai, Noor (April 18, 2024). "Meta's AI Is Telling Users It Has a Child". Futurism. Retrieved January 22, 2025.
  19. ^ Widom, Jennifer. "Stanford Computer Science 2013 Newsletter".
  20. ^ "PET Award for Outstanding Research in Privacy Enhancing Technologies".
  21. ^ "Conference Programs".
  22. ^ "NSF Award Search: Award # 1943584 - Career: Towards Privacy and Fairness in Multi-Sided Platforms".
  23. ^ "2024 Sloan Research Fellows".
  24. ^ Biden White House. "President Biden Honors Nearly 400 Federally Funded Early-Career Scientists".