- cross-posted to:
- technology@beehaw.org
- hackernews@lemmy.smeargle.fans
Apple wants AI to run directly on its hardware instead of in the cloud::iPhone maker wants to catch up to its rivals when it comes to AI.
Remember, this probably isn’t an either/or thing. Both Apple and Google have been offloading certain AI tasks to devices to speed up response times and process certain requests offline.
Yep, though Google is happy to process your data in the cloud constantly, while Apple consistently tries to find ways to do it locally, which is generally better for privacy and security, and cheaper for them too.
Yeah, that’s why they look through your images for “cp”.
deleted by creator
Just because certain requests don’t work offline doesn’t mean Google isn’t running models locally for many requests.
My Pixel isn’t new enough to run Nano. What are some examples of offline processing not working?
I wouldn’t be surprised if the handshake between Pro and Nano was intermingled for certain requests. Some stuff done in the cloud, and some stuff done locally for speed - but if the internet is off, they kill the processing of the request entirely because half of the required platform isn’t available.
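Purely as an illustration of that speculation, here’s a rough sketch of what such routing could look like. All of the names here are hypothetical, not Google’s actual APIs; the point is just the “drop the whole request if half the platform is missing” behavior.

```kotlin
// Hypothetical sketch of hybrid on-device / cloud routing.
// None of these names are real Google APIs; this only illustrates
// killing a hybrid request when the cloud half is unreachable.

enum class Route { ON_DEVICE, CLOUD, HYBRID }

data class AiRequest(val prompt: String, val route: Route)

class HybridRouter(private val isOnline: () -> Boolean) {
    // Returns null when the request is dropped because connectivity is required.
    fun handle(request: AiRequest): String? = when (request.route) {
        Route.ON_DEVICE -> runLocally(request.prompt)
        Route.CLOUD, Route.HYBRID ->
            // If any part of the request needs the cloud and we're offline,
            // abandon the whole thing rather than return a partial answer.
            if (isOnline()) runLocally(request.prompt) + runInCloud(request.prompt)
            else null
    }

    private fun runLocally(prompt: String) = "[local: $prompt]"
    private fun runInCloud(prompt: String) = " [cloud: $prompt]"
}

fun main() {
    val router = HybridRouter(isOnline = { false })
    println(router.handle(AiRequest("summarize this page", Route.HYBRID)))   // null: killed while offline
    println(router.handle(AiRequest("transcribe this audio", Route.ON_DEVICE))) // still works offline
}
```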
deleted by creator
What a thought provoking reply.
deleted by creator
You’re really going to say that Google isn’t doing anything locally with Tensor? That’s just silly.
deleted by creator