The Pulse #96: Apple demonstrates AI is best as many small features
Apple showcased how generative AI will spread across its operating systems, and how users can expect it to be free. Also: a new standard in confidential computing, and an outage “caused” by ChatGPT.
The Pulse is a series covering insights, patterns, and trends within Big Tech and startups. Notice an interesting event or trend? Send me a message.
Today, we cover:
Industry pulse. Microsoft will fix Recall’s glaring security holes instead of shipping it as-is; Twitter forgot to collect a laptop from a fired employee for 1.5 years; regions where it’s easier to raise pre-seed funding; Microsoft’s performance review cycle is in progress; and more.
Apple demonstrates AI is best as many small features. Apple showcased dozens of generative AI-powered, operating system-level improvements coming to iOS, iPadOS, and macOS. It’s the most convincing demonstration yet of how GenAI-powered features will be useful on smartphones, day to day.
A new standard in confidential computing: Apple Private Cloud Compute. Apple takes user data privacy seriously, and has launched the most secure cloud backend around. It is designed to safeguard user data processed by powerful AI models running on Apple’s cloud, and sets a new bar in verifiable security.
Who’s to blame: ChatGPT or a dev? An early-stage startup suffered an outage that prevented customers from purchasing a subscription for five days. It turned out ChatGPT had generated the buggy lines of code that caused the problem. But is it fair to blame a hammer when you bang your thumb with it, or is the tool actually at fault this time?
1. Industry pulse
Recall recalled
Last week, Microsoft faced warranted criticism for attempting to ship a highly invasive, continuous screenshot-taking feature (Recall) enabled by default, with zero regard for basic security practices. I wrote that I couldn’t see how Microsoft could ship Recall without fixing these basic security gaps.
This week, Microsoft reached the same conclusion. The company will now switch the feature off by default, encrypt its data, and require authentication to access Recall’s stored data. The incident is another example of Microsoft inexplicably failing to follow basic security practices for operating system-level features. My hunch is that the tech giant cut these corners deliberately, after calculating that shipping the feature in time for the Copilot+ PC launch mattered more than building it properly. If so, it’s yet another sign that Microsoft needs to refocus on security basics.