Lee Duna@lemmy.nz to Technology@beehaw.org · English · 11 months ago
Apple wants AI to run directly on its hardware instead of in the cloud (arstechnica.com) · 26 comments
Quokka@quokk.au · 11 months ago
You can already run an LLM natively on Android devices.
Vibrose@programming.dev · 11 months ago
You can on iOS as well!
https://apps.apple.com/us/app/private-llm/id6448106860
snowe@programming.dev · 11 months ago
The hard part isn’t running AI on a device… it’s doing so while retaining battery life, performance, and privacy.
JackGreenEarth@lemm.ee · 11 months ago
Which one do you use? I tried MLCChat, but all three times it either showed a Java error or generated gibberish. What’s worked for you?