
Apple Hosts Workshop on Privacy‑Preserving Machine Learning and AI

Apple reiterated privacy as a fundamental human right and shared advances from its two‑day workshop on privacy‑preserving machine learning, covering federated learning, foundation models, and secure computation.

By Alex Mercer, Senior Tech Correspondent · 3 min read

Source: Apple Machine Learning Research

TL;DR: Apple reiterated that privacy is a fundamental human right and showcased its latest research at a two‑day workshop on privacy‑preserving machine learning and AI, with sessions on private learning, foundation models, and attacks and security.

Context

Apple has long positioned privacy as a core value, stating that it is a fundamental human right. As artificial intelligence becomes more embedded in everyday products, the company says protecting user data while enabling innovative features is essential. To advance this goal, Apple brought together its researchers and external experts for a focused event on privacy‑preserving techniques. The event attracted academics from universities in Europe and North America, as well as engineers from Apple’s AI and privacy teams.

Key Facts

The Workshop on Privacy‑Preserving Machine Learning & AI took place earlier this year and lasted two days. Sessions were organized around three themes: private learning and statistics, foundation models and privacy, and attacks and security. Presentations included topics such as federated learning, differential privacy, homomorphic encryption, and methods to mitigate memorization in large language and diffusion models. Apple researchers also shared work on combining machine learning with homomorphic encryption within its ecosystem. Over thirty talks were delivered, covering both theoretical advances and practical implementations.
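To make one of these techniques concrete, here is a minimal sketch of differential privacy's classic Laplace mechanism applied to a count query. The function name and parameters are illustrative assumptions for this article, not drawn from any Apple implementation:

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A count query has sensitivity 1 (adding or removing one user
    changes the result by at most 1), so Laplace noise with scale
    1/epsilon is enough to satisfy the privacy guarantee.
    Hypothetical helper for illustration only.
    """
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: report how many users enabled a feature, with epsilon = 1.0.
# Smaller epsilon means stronger privacy but a noisier answer.
noisy = laplace_count(true_count=1234, epsilon=1.0)
```

The same trade-off (a privacy budget epsilon balancing accuracy against protection) underlies the differential-privacy work discussed at the workshop.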

What It Means

The workshop signals Apple’s commitment to grounding AI development in rigorous privacy safeguards. By highlighting approaches like federated learning and secure computation, the company shows how it intends to balance performance with data protection. The discussions also reveal ongoing challenges, particularly around securing foundation models against inference attacks, which will shape future research directions.
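As a rough illustration of the federated-learning idea mentioned above, here is a minimal sketch of a single weighted-averaging aggregation round (the FedAvg step): raw data stays on each device, and only model parameters are combined centrally. Names, shapes, and values are illustrative assumptions, not Apple's code:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """One FedAvg aggregation round: average client model weights,
    weighted by each client's local dataset size. Raw training data
    never leaves the device; only parameter updates are shared.
    Hypothetical sketch for illustration only."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)       # shape: (clients, params)
    coeffs = np.array(client_sizes) / total  # per-client weighting
    return coeffs @ stacked                  # weighted average

# Three simulated clients, each holding a 2-parameter local model.
weights = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
sizes = [10, 10, 20]
global_model = federated_average(weights, sizes)  # -> [0.75, 0.75]
```

In production systems this aggregation is typically combined with the secure computation and differential-privacy techniques the workshop covered, so the server never sees any individual client's update in the clear.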

Apple plans to publish recordings and papers from the event, allowing the broader community to build on the presented work. Observers should watch for how these privacy‑preserving methods translate into upcoming iOS, macOS, and Siri features, and whether Apple will adopt new standards for AI transparency and user control. Developers may soon see updated APIs that simplify integration of differential privacy into third‑party apps.
