Apple’s Unveiling of AI Features at WWDC 2024: A Legal Perspective
The Case for Apple Intelligence
Apple, the tech giant synonymous with sleek design and user-friendly interfaces, has unveiled its latest AI features under the umbrella term Apple Intelligence. These features aim to simplify our daily tasks. Let us present the evidence:
Exhibit A: Photo Editing with AI in Photos
The Photos app now wields AI like a seasoned litigator cross-examining evidence. Object removal appears much smoother.
Whether you’re erasing red-eye from a family portrait or obliterating that photobombing squirrel, AI assists in achieving pixel-perfect results.
Legal risks? Fake photos will likely proliferate, and questions over the evidential value of photographs will likely increase. Also, apps used for anti-money-laundering and background checks will need to adjust (if they have not done so already).
Exhibit B: Siri’s Enhanced Control
Siri, the digital legal assistant, is now set to wield more power and to draw on ChatGPT. It can delete emails, edit photos, and perform other tasks with, perhaps, the precision of a seasoned paralegal.
This could assist legal teams with improvements in research and case management.
Legal risks? Remember the cases last year in which lawyers, including Michael Cohen, were found to have relied on fake case citations generated by AI. Consider also data retention policies before instructing email deletions. Care, caution and lots of cross-checking still apply.
Exhibit C: Quick Recaps and Smart Legal Suggestions
Apple Intelligence is expected to provide succinct recaps of case notes, transcripts, and legal briefs.
Additionally, the system can suggest a variety of responses for emails and messages, streamlining our communication with clients and colleagues.
Legal risks? As with all communications, an executive summary is very welcome to clients but must be checked and rechecked for accuracy.
Exhibit D: Transcription of Voice Memos: Verbatim, Your Honor
Voice memos, once mere audio evidence, can be automatically transcribed. No more deciphering cryptic scribbles taken during meetings.
Legal risks? For those who remember digital voice dictation software, it was fraught with errors and needed many hours of training. Further, there is legal jargon, nuance and those Latin terms that may not translate clearly.
Exhibit E: The Mail App: Organised Evidence Management
The Mail app, akin to a well-organised legal file, is expected to categorise emails effectively. It even generates boilerplate responses.
Managing our digital caseload becomes more efficient, and our inbox is no longer a chaotic discovery pile.
Legal risks? For those using Outlook and standard template responses, this might not be new to you. The key legal risk here is losing control over your communications with clients and colleagues. Again, care, caution and lots of cross-checking still apply.
Data Protection and Cybersecurity
Now, let us pivot to the protection of user data and safeguarding against cyber threats.
Point I: iMessage Contact Key Verification
iMessage is expected to offer contact key verification.
This allows users to verify that they are communicating only with the people they intend, with an alert if the security of Apple's servers were ever compromised.
Point II: Security Keys for Apple ID
Users can opt for a physical security key to access their Apple ID account. This means only authorised individuals can unlock their digital identity.
This additional layer of protection enhances account security, akin to presenting a valid ID in person (eg, election voting; airport passport checks).
Point III: Advanced Data Protection for iCloud
Sensitive information—be it backups, photos, or notes—will remain encrypted, even during transmission between devices and servers.
Point IV: Threat Detection in Safari
Safari, Apple’s web browser, is said to include advanced threat detection capabilities.
Malicious websites trigger alerts, preventing users from unwittingly exposing themselves to phishing attacks or malware.
Point V: App Privacy Reports
Apple has introduced App Privacy Reports, allowing users to see how apps handle their data.
Before granting permissions, users can assess an app’s track record—whether it respects privacy or overreaches.
Point VI: On-Device AI Processing
Whether it’s voice recognition or personalised recommendations, AI will process some of this data locally (ie, on the user’s device), with the intention of preserving privacy.
The capability will mean Apple is “aware of your personal data without collecting your personal information,” according to Apple.
Where more computing power is needed to address a user’s query, Apple’s “Private Cloud Compute” is used. A user’s information may be sent to a secure server to be processed, but Apple says it will not be stored.
Conclusion: As with all technology updates, users and developers will be the true judges of whether these updates benefit or hinder. Based on an initial assessment of the Keynote presentation, the data privacy elements appear, at least, to be positive. And in case you wondered whether there was anything else of note: there is Apple’s Vision Pro - a VR headset - coming to the UK on 12 July 2024. There are many legal risks arising around wearable technology of this kind, but that’s for another article. To be continued …
Subscribe Now: RMOK will publish regular updates on AI regulation and practical steps CEOs, Directors and Startups may adopt towards compliance. Receive real-life case studies, insights, general briefs and rolling legal updates that keep you ‘in the loop’. At RMOK … we look after it.