- cross-posted to:
- technology@lemmy.world
cross-posted from: https://lemmy.world/post/16391311
Andrej Karpathy endorses Apple Intelligence
Actually, really liked the Apple Intelligence announcement. It must be a very exciting time at Apple as they layer AI on top of the entire OS. A few of the major themes.
Step 1 Multimodal I/O. Enable text/audio/image/video capability, both read and write. These are the native human APIs, so to speak.
Step 2 Agentic. Allow all parts of the OS and apps to inter-operate via “function calling”; kernel process LLM that can schedule and coordinate work across them given user queries.
Step 3 Frictionless. Fully integrate these features in a highly frictionless, fast, “always on”, and contextual way. No going around copy pasting information, prompt engineering, or etc. Adapt the UI accordingly.
Step 4 Initiative. Don’t perform a task given a prompt, anticipate the prompt, suggest, initiate.
Step 5 Delegation hierarchy. Move as much intelligence as you can on device (Apple Silicon very helpful and well-suited), but allow optional dispatch of work to cloud.
Step 6 Modularity. Allow the OS to access and support an entire and growing ecosystem of LLMs (e.g. ChatGPT announcement).
Step 7 Privacy. <3
We’re quickly heading into a world where you can open up your phone and just say stuff. It talks back and it knows you. And it just works. Super exciting and as a user, quite looking forward to it.
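To make the "agentic" step (Step 2) a bit more concrete, here is a purely hypothetical sketch of what OS-level function calling could look like: apps declare actions, and a system-level model emits structured calls against them rather than free text. None of the names below are Apple's actual APIs; this is just an illustration.

```swift
// Hypothetical sketch of Step 2's "function calling": apps register the
// actions they expose, and an on-device LLM planner picks which one to call.
// These names are made up for illustration, not Apple's real APIs.

struct AppFunction {
    let name: String
    let description: String
    let handler: ([String: String]) -> String
}

// Registry of app capabilities exposed to the system-level model.
var registry: [String: AppFunction] = [:]

func register(_ fn: AppFunction) {
    registry[fn.name] = fn
}

register(AppFunction(
    name: "calendar.createEvent",
    description: "Create a calendar event with a title and ISO-8601 start time.",
    handler: { args in
        "Created event '\(args["title"] ?? "")' at \(args["start"] ?? "")"
    }
))

register(AppFunction(
    name: "messages.send",
    description: "Send a text message to a contact.",
    handler: { args in
        "Sent '\(args["body"] ?? "")' to \(args["to"] ?? "")"
    }
))

// Stand-in for the on-device model: given a user query, it would emit a
// structured call (function name + arguments) instead of free text.
func plan(query: String) -> (name: String, args: [String: String]) {
    // A real model would produce this; here we hard-code one example plan.
    return ("calendar.createEvent",
            ["title": "Lunch with Sam", "start": "2024-06-14T12:00:00Z"])
}

let call = plan(query: "Schedule lunch with Sam on Friday at noon")
if let fn = registry[call.name] {
    print(fn.handler(call.args))  // -> Created event 'Lunch with Sam' at ...
}
```

The later steps would layer on top of the same dispatcher: Step 3 feeds it on-device context instead of copy-pasted prompts, and Step 4 has the model invoke it proactively rather than waiting for an explicit query.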
Privacy?
Apple secures third-party audits for their devices and designs; additionally, security researchers have methods of verifying certain aspects of device behavior. People dig into this stuff, and not only does Apple have a history of good privacy design, but as far as I'm aware they've never been caught doing anything remotely out of scope of their tight-knit privacy policies with user data. Your complaint is baseless.
That's a very interesting point. Let's read a bit of their privacy policy:
Hmm, interesting… What do they need?
This one is particularly interesting and very, very ambitious; this could be anything! What is the device trust score?
OK, so it's difficult for them to violate their own privacy policy when they literally reserve the right to collect anything they want.
BTW, some user-rights associations don't agree with Apple on this. Here is one example I could find very quickly: France fines Apple over App Store ad targeting ePrivacy breach | TechCrunch
You do understand they use all this data to provide services, and as such they have to disclose that in their privacy policy, right? For example, health data collection literally has to be disclosed in order to offer health services such as step tracking. You're way, way off base here.
Come on… You're telling me they have to collect all this data to offer the service? They have to send your health information to their servers to give you your daily step count?
Having it all disclosed in the privacy policy is a formality. Honestly, did you read it carefully before buying your first iPhone? And do you review it again every time they update it?
Today you seem satisfied with the amount of data they collect about you and how they use it. Where do you set the boundaries? When would it be too far for you, making you reconsider using these products?
I am not trying to offend you. I am legitimately interested.
No, but they have to disclose all possible avenues of collection. I for one like storing my health data in iCloud for processing and retention. They take that data, run it through algorithms, and use it to provide me things like estimated sleep-cycle details.
Yes. Also yes. I find quite a bit of it distasteful, but as a systems administrator I have to be informed of all privacy policies guiding the disclosure and use of company data. It sucks, they're lengthy and overwhelming, and you're right that they often ask for too much. But at the end of the day it's less than you'd expect, and they never make their money selling it, which is more than you can say about any software company of Apple's scale.
If I set the boundaries, they'd have none. That's my preference and why I E2E encrypt everything on my device. I'd give up features and self-host if I could, but all of that just isn't possible for your average user, or for them to stay competitive in their business model. Users don't want to know what E2E is, they don't want things "losable", and honestly they don't care about their privacy (check the privacy policies of Meta and TikTok vs Apple if you don't believe me that there's a difference and that the vast majority don't care). That being said, Apple provides what I see as the best middle ground: enough privacy to remain confident my data is secure (E2E iCloud backups, E2E messaging, etc.) but enough gathering to keep their services competitive with more lucrative competitors with looser policies. Oh, and it would be too far when they started selling it to third-party companies. That's what made me leave my Android phone behind: Google migrating all the APIs to Google Play Services instead of AOSP APIs.
No offense taken, I understand your rage and I agree with your sentiment. They ask too much. But compared to the other options, it's the safest path, in my honest opinion.
Saying Apple is much better than TikTok, Meta, and Google is just setting the bar very low… Apple advertises heavily about how privacy-oriented they are, so it's only fair to compare them to other privacy champions (regardless of company size), e.g. Proton or Signal.
I absolutely agree with you. It's not possible for most people to be a security-conscious self-hosting guru, and we can't expect everyone to care about digital privacy and security. That's why I see it as our duty, as more tech-savvy people, to set the standard for the industry and to advocate and lobby for companies to do better.
You don't ask a nurse to set the safety standards for electrical installations, but everyone benefits from electricians and engineers defining and setting those safety standards!
It's actually not that people don't care about privacy. It's that they don't know about privacy, because they don't think about it, and that's alright. But ask a random person to hand over their phone unlocked and start browsing their photos. They will care.
Anyway, I’m glad we could have an educating conversation here. I got to learn a couple of things :)
Did you even watch WWDC? Apple showed how its servers, and the data flowing from device to cloud, are going to work. The entire thing is open to third-party scrutiny, and moreover their servers won't run on anything that isn't. They're basically as transparent as Mullvad in ensuring you never end up in a "trust me, it's secure, bro" scenario. Craig even joked that most businesses that rolled out AI privacy are doing just that.
Or would you rather just be an edgelord?
Comparing them with Mullvad is ridiculous and just shows how much of the apple juice you drank without questioning it.
“leave alone the multi billion dollar corporation” energy
Wow another idiot who did zero research into Apple’s implementation and just uses the “they are a billion dollar company” slam.
Not saying it's better than Mullvad, but it's not nearly as poor as other VPNs. Also, no, Apple doesn't send your contact info or other details into the cloud; all that shit is anonymized.
I mean, at least do SOME research into the service you're slamming and saying is just as insecure as Copilot+, ffs.
Yes, Lemmy needs to hate Apple. We get it.
Are you sure about that? It is written in their legal Privacy Policy, in the section "Personal Data Apple Collects from You":
Doesn't seem so anonymized to me…
Go watch their WWDC 2024 keynote on how their cloud computing is set up to handle privacy and the sharing of information from device to cloud, and how they set up their proprietary servers.
As for generally collecting details (like contact info, location info, payment details, and even government-issued IDs): how do you propose they offer services like Apple Card, Apple Pay, digital driver's licenses, etc. without knowing that info???
I swear half of Lemmy are doomsday preppers! We are talking about secure transmission from device to cloud while respecting your privacy, not about using Apple services as a ghost. They are not aiming to replace Mullvad; they want to build AI on top of a secure platform that doesn't shine a floodlight on your private life.
Scour their EULA all you want; it really has nothing to do with the conversation. We all know they have info on you, just like we all know smoking causes cancer.
No source code, no trust. How good is my data on their super-secure servers if they have the encryption keys?
How good is third-party scrutiny if the auditors are mandated and paid by Apple?
They said people will be able to independently verify their claims.
You can’t independently verify without source code. Will it be published?
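For what it's worth, "verify" doesn't have to mean reading all the source: the WWDC pitch was that researchers can inspect the server images and that devices check the software they connect to against publicly logged measurements. As a rough, hypothetical sketch of that general idea (the hash, data, and log contents here are made up, and this is not Apple's actual mechanism):

```swift
import Foundation
import CryptoKit

// Conceptual sketch only: one way to "independently verify" a server build
// without full source code is to check that the hash (measurement) of the
// binary image you observe matches an entry in a public, append-only log.

let publishedMeasurements: Set<String> = [
    // Hashes an auditor would fetch from a public transparency log (made up).
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9"
]

let observedImage = Data("hello world".utf8)  // stand-in for a server image
let digest = SHA256.hash(data: observedImage)
let hex = digest.map { String(format: "%02x", $0) }.joined()

if publishedMeasurements.contains(hex) {
    print("Image matches a published measurement: \(hex)")
} else {
    print("Unknown image! Measurement \(hex) is not in the log.")
}
```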
You almost certainly run all of your software on code you lack access to the source for. Firmware and the like have been completely proprietary for ages. There's even a tiny proprietary OS embedded in almost every processor on the planet. Your statement lacks computing context and shows misplaced trust.
https://www.zdnet.com/article/minix-intels-hidden-in-chip-operating-system/
https://en.m.wikipedia.org/wiki/AMD_Platform_Security_Processor
Intel and AMD security platforms are shit; I know they exist, and I don't trust them either, but the threat model is very different from this AI application. They don't upload my photos to a server out of my control.
They're not just security platforms. They're low-level computer systems with entire bespoke operating systems and better-than-kernel-level access to the system (networking, etc.). You have no idea what you're talking about. Please inform yourself.
I may not be very well informed about it, I admit. Do you have documents to share? All I seem to find about the Intel security engine is about how closed it is and how it has been exploited by bad actors in the past to gain elevated privileges in targeted hacks. That sucks, for sure. But it doesn't have much to do with the mass data collection employed by Google, Apple, OpenAI, etc.
I unfortunately don't have much to share beyond a decent understanding of compute systems at an enterprise scale (where we use these low-level subprocessors to do various things such as gather asset data or deploy operating system configurations, see: https://en.m.wikipedia.org/wiki/Intel_Active_Management_Technology). The point I'm trying to make, though, is that current operating models don't allow for system trust. If you can't trust Apple with high-level data like that needed for on-device LLM models (which is how they've configured it, requiring specific user approval and interaction before forwarding minimal data to Private Cloud Compute servers), then you shouldn't trust any device that lacks a completely open boot/firmware/OS stack, because if these companies were going to exploit your data that egregiously, they already have the lowest-level (best) access possible: a system that can transparently (without your knowledge) access encryption enclaves, networking, and storage. Truly open alternatives do exist, by the way (see Coreboot, etc.), but you're going to be looking at devices 10-20 years old, since almost the entire industry runs proprietary at that level and it takes time for the less heavily funded community players to get up to speed.
Again, you're just talking out of your ass. Anyway.