I would like to use this model on my iPad, but I can't find it in the App Store. Can someone please help me?

#8
by evewashere - opened

Also looking to run this locally on my Samsung smart fridge, pls fix

Oh, it's super easy! They just forgot to mention it in the README.

  1. Get an iPad.

  2. Pick up a few dozen NVIDIA H200s. (Don't worry, they're on sale at your local Best Buy for a few bucks each.)

  3. You'll need a small, liquid-cooled data center to power them. (You can probably just clear out your basement.)

  4. Set up the whole thing in the back of your Samsung Smart Fridge or iPad for optimal thermal performance.

  5. Once all that's done, just refresh the App Store page. It should pop right up next to Sora's new app. You're welcome!

It may actually work from an SSD or SD card, but with very, very slow inference speed; models with fewer active parameters speed this up. Use llama.cpp with mmap in a terminal app (Termux on Android devices). I don't have an Apple device to test, but it may work on an Android device. A faster SSD/SD card means faster inference, so the real problem is the low read speed of the SSDs and SD cards available.
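If anyone wants to try that route, here is a rough sketch using the llama-cpp-python bindings (the GGUF path, quant, and settings below are placeholders, not anything shipped with this repo, and I haven't timed this on a tablet):

```python
# Rough sketch: mmap a GGUF from slow storage with llama-cpp-python,
# so weights are paged in from the SSD/SD card instead of loaded fully into RAM.
# Install first, e.g. `pip install llama-cpp-python` (works inside Termux too).
from llama_cpp import Llama

llm = Llama(
    model_path="/sdcard/models/model-Q4_K_M.gguf",  # placeholder path and quant
    use_mmap=True,    # map the file; the OS reads pages on demand
    use_mlock=False,  # don't pin pages in RAM; let them be evicted and re-read
    n_gpu_layers=0,   # CPU only, as on a phone or tablet
    n_ctx=2048,
    n_threads=4,      # tune to the device's cores
)

out = llm("Explain what mmap does in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

With use_mmap the model doesn't have to fit in RAM, but every evicted page has to be re-read from storage, which is exactly why the read speed of the SSD/SD card ends up being the bottleneck.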
