Apple changed its business case. Isn’t that supposed to be acknowledged in SEC filings? Its mission used to be putting a computer on every desktop, then in every pocket, then… with CSAM, into every photo – child sexual abuse material surveillance embedded in its operating system. Apple has strategically repurposed its software, hardware, AppStore and corporate mission from computing to a Man in the Middle business.
I put out an earlier short take on Apple’s CSAM privacy ambitions, but it missed the huge pivot we are witnessing. Every Apple device is a platform firmly planting Apple as a man-in-the-middle trojan horse. Apple’s AppStore franchise is legally positioned at hand and in your pocket, literally. If you use Apple Pay, it’s in your wallet! All subscriptions, purchases, pornography, music, email, calls and photos are its business – its Man in the Middle position provides unbridled access. Apple can monetize, scrutinize or change each of them, whether it’s AAC digital music, computational photography, AppStore content, email content, call logs or even your viewing pleasure. Apple Studios extends Apple’s influence to small-screen content through the AppleTV subscription.
Epic’s lawsuit against Apple collapsed into a very expensive, partial win: the court ruled in Epic’s favor that the AppStore must allow free-enterprising developers to offer AppStore customers alternatives. But the court ruled against Epic on the monopoly question, leaving Apple’s AppStore free to continue its reign without further hindrance. Subsequently, Apple took a historic “lockout” monopolist action, refusing to let Epic back onto its AppStore. With these lockout tactics Apple brought union busting into the digital age: a developer stood up for the right to free market capitalism, took Apple to court, and won – but only a pyrrhic victory.
Apple’s steampunk actions extend, free and clear, an American criminal past: the digital reemergence of a form of capitalism once legally restrained, now enabled by force of law.
Steampunk Capitalism
08 Dec 2022
Apple is reportedly ditching a controversial plan to scan users’ photos stored in iCloud for child sexual abuse material, or CSAM, amid an ongoing privacy push.
These safety tools, announced in August 2021, were meant to flag illicit content while preserving privacy. But the plans drew widespread criticism from digital rights groups, who argued that the surveillance capabilities were ripe for abuse.