How do android users cope with the fact iOS 18 will be the biggest revolution in phones with a bunch of ai features running locally? Imagine needing the cloud for your ai kek
imagine shitting on the street
Sirs...
That's great. Show similar for profits?
Irrelevant
If you're building apps, it's very relevant. What's the point of getting 19X more exposure to a market that has 1000X less money? You think it's some accident that Insta and Snap and a bunch of other high profile apps debuted on iOS and stayed that way for years?
Completely irrelevant. The ones who shit on the street are the same ones who use android (Indians)
>iOS and Android market shares grew in India
What did he mean by this?
Swap?
Yup, they really like beating their cash cow.
This fricking cancerous shit caused my phone to freeze and hang. Kys and go to hell.
>caused my phone to freeze and hang
lmao, poorgay phone Black folk BLOWN THE FRICK OUT
>Nooooo, I must cunsoooom and pay $1199 for a damn cellphone.
I currently have an excellent Samsung A03 and I do not need more.
sounds like you should throw it in the bin and buy a real computer instead of being a phoneBlack person imo
Hmm, alright. Look, I'll tell you what. I currently have around $500 that I can spend. Please suggest a decent laptop. Also, I'd like to apologize for the very rude language I used in my first message. I was really annoyed at my phone freezing up and losing many tabs that were open on it, hence I lost my cool. Sorry about that.
no need to apologize on a chinese basket weaving forum my fren
>Please suggest a decent laptop
don't know much about laptops m9, maybe just get one of those cheap Minisforum PCs for a few hundred; unfortunately it won't have a screen, meaning it will hinder your ability to shitpost on the go
>a03
holy shit are you a third worlder or a Black person?
even poor people dont buy shitty laggy ewaste like this
Why the frick is this a webm?
see
He's found some webm that furryfox can't play so he's been converting every png he has into it
wdym? firefox plays it just fine
>troonyfox
KEK, I would feel humiliated and deeply ashamed if anybody found out I used that thing. You could have at least blurred that part out.
2 more weeks for proper HDR support, right?
>flash memory utilization technique
so.. they plugged in a fricking SD card and use it as shitty slow RAM?
You do realise SSDs are flash memory? Fricking idiot
>Sam "Frick me in the ass cuz I'm a homosexual" Altman asking for $7trillion for LLM training costs
>vs
>Apple chads figuring out efficient ways to run LLMs down to the level of running it on a fricking smart phone locally
Steve Jobs's performance-per-watt approach to tech improvement lives on to this day, and Apple is still winning.
this
What is this going to do for iPhags? Let them write texts with eyeball motions or something?
"AI" has no functional purpose, it's just another feature they can talk about to convince you that there's a real reason to buy the latest shit.
iPhones have had a built-in NPU, called the Neural Engine, since the iPhone X,
and the latest Neural Engine in the A17 Pro can do 35 TOPS of AI operations
Will the SE 2022 be capable of handling all this AI stuff? Im a poorgay who doesn't want to burn money on an iphone 12/13
yes, it has the A15, whose Neural Engine does AI operations at 15.8 TOPS
You morons will literally argue about anything. Not even bot is this bad.
BOT is definitely not that bad. Most still may see a good underlying game design even in a shit series they don't like. Here it's all shitflinging tribalism for the sake of it.
Even we had a Palworld thread because, going by a Twitter post, they're too moronic for software-based version control.
Why are you saying this as if it's some kinda boon? I don't want or need AI running locally on my phone..
Onboard image denoising (i.e. waifu2x or better) would help convert those bloated JPGs to 10-bit AVIFs: 90% file size reduction right there. Though it'll be sad if hardware encoders get limited to 8-bit; then you'd only get like a 50% file size reduction, which would be pretty meh.
>the avigay is here
please shut up. avif is doa
Yeah, it's not like websites that cater to hundreds of millions of users are starting to use AVIF right now. That would be crazy, huh?
https://www.bilibili.com/
can't wait for Apple to invent AI!
I'm pretty sure Qualcomm's latest SoCs also have an AI accelerator with similar TOPS performance. Pretty sure all the assistant apps, like Siri, Cortana, Home, Alexa, Bixby, etc., will be LLM-based from now on.
>Google invented the LLM
>Google put the voice assistant locally on Pixels years before the iPhone
But tiny Apple research publishes a paper :-O
What happens if I ask my iphone about which group is most likely to steal an iphone?
already a thing
https://apps.apple.com/us/app/private-llm/id6448106860
>a bunch of ai features running locally?
All that does is eat through your battery and produce subpar results.
The core of the "LLM in a Flash" paper is the use of a low-rank predictor to predict which ReLU-activated outputs of the FFN are active.
Now look here : https://arxiv.org/abs/1312.4461
"Scalability properties of deep neural networks raise key research questions, particularly as the problems considered become larger and more challenging. This paper expands on the idea of conditional computation introduced by Bengio et al., where the nodes of a deep network are augmented by a set of gating units that determine when a node should be calculated. By factorizing the weight matrix into a low-rank approximation, an estimation of the sign of the pre-nonlinearity activation can be efficiently obtained. For networks using rectified-linear hidden units, this implies that the computation of a hidden unit with an estimated negative pre-nonlinearity can be omitted altogether, as its value will become zero when nonlinearity is applied. For sparse neural networks, this can result in considerable speed gains. Experimental results using the MNIST and SVHN data sets with a fully-connected deep neural network demonstrate the performance robustness of the proposed scheme with respect to the error introduced by the conditional computation process."
Apple did not cite the paper, I'm going to guess because it would have made the patent application hard.
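The predictor idea is simple enough to sketch. Toy version below: a truncated SVD stands in for the small trained predictor the paper actually uses, and the sizes are made up, but the mechanism (cheap low-rank guess of which ReLU outputs are nonzero, then only computing those rows) is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff, rank = 64, 256, 8   # toy sizes, not real FFN dims

W = rng.standard_normal((d_ff, d_model))   # FFN up-projection weights
x = rng.standard_normal(d_model)           # one token's hidden state

# Low-rank factorization of W, used only to predict the SIGN of W @ x.
# Here we cheat and build it from W's own SVD; the papers train a
# predictor instead, but the conditional-computation idea is identical.
U, S, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :rank] * S[:rank]     # (d_ff, rank)
B = Vt[:rank]                  # (rank, d_model)

pred = A @ (B @ x)             # cheap: O(rank * (d_ff + d_model))
active = pred > 0              # rows predicted to survive the ReLU

# Only compute the predicted-active rows; the rest are assumed zero,
# so their weights never need to be loaded from flash at all.
y_sparse = np.zeros(d_ff)
y_sparse[active] = np.maximum(W[active] @ x, 0.0)

y_exact = np.maximum(W @ x, 0.0)
recall = (y_sparse[y_exact > 0] > 0).mean()
print(f"computed {active.mean():.0%} of rows, recall of true activations: {recall:.0%}")
```

Rows the predictor gets right are exact (same ReLU on the same dot product); the error is only the activations it wrongly predicts as negative.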
The groundbreaking method is reducing float operations to int8
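For anyone who hasn't seen how little magic is in the int8 part: a minimal sketch of symmetric per-tensor quantization on toy data (generic textbook scheme, not necessarily Apple's exact one).

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(1000).astype(np.float32)   # toy weight vector

# Symmetric per-tensor int8 quantization: store one float scale
# plus the weights as int8, i.e. a 4x smaller memory footprint.
scale = np.abs(w).max() / 127.0
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# Dequantize to check the error this introduces.
w_hat = q.astype(np.float32) * scale
max_err = np.abs(w - w_hat).max()
print(f"max quantization error: {max_err:.4f} (scale = {scale:.4f})")
```

Rounding to the nearest int8 step bounds the per-weight error at about half the scale, which is why int8 usually costs little accuracy for a big win in memory and bandwidth.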
Apple Black folk did it again
APPLE INVENTED SWAP FILE
Think the iPhone is special? A whole shitload of AI accelerator chips is hitting the market any day now.
>imagine needing an AI for the most trivial shit and constant spying
I also have an over-engineered slab of glass.
Enjoy your ai slop
I will.
I can't wait to search some cooking recipe on my phone and have some LLM eat up my battery life to give me an answer.
>battery life: 30 minutes
I don't particularly care thanks
"breakthrough method"
let me guess, they're pretending they invented cloud computing now
They're a devices company first. They've always talked about wanting AI on the user end as much as possible, not the cloud.
Okay, I'll bite, give me a single good reason why I would want AI anything on my phone?
You should want everything on your end, if given the choice. That's the whole point of PCs in general. AI challenges a lot of this philosophy, but it should be followed when possible.
The operative words aren't "on my phone", it's "AI".
Basically this, everything I've ever seen about AI is pure 1 use gimmicks.
Finally, an AI gf on my iPhone!