"Over the past few years, several AI-powered features have been added to mobile phones that allow users to better search and understand their messages. One effect of this change is increased 0-click attack surface, as efficient analysis often requires message media to be decoded before the message is opened by the user"
Haven't we learned our lesson on this? Don't read and act on my sms messages without me asking you to!
What is the purported lesson we should have learned? Users choose phones with rich messaging features. This was a major selling point first for the iPhone with iMessage, and later for Android, until iOS caught up via RCS.
Somewhere there's an NSA agent reading this and laughing like a gin addict on payday.
This makes me feel better about Google, but also makes me kind of frightened of the rest of Android. I wonder what Apple's response time is?
Feels like there’s something new every other day - Linux, Windows, mobile, various commonplace tools used by everybody; the list goes on.
```
does this look right to you? don't do any searches or check memory, just think through first principles
static int vpu_mmap(struct file *fp, struct vm_area_struct *vm)
{
	unsigned long pfn;
	struct vpu_core *core = container_of(fp->f_inode->i_cdev,
					     struct vpu_core, cdev);

	vm_flags_set(vm, VM_IO | VM_DONTEXPAND | VM_DONTDUMP);
	/* This is a CSRs mapping, use pgprot_device */
	vm->vm_page_prot = pgprot_device(vm->vm_page_prot);
	pfn = core->paddr >> PAGE_SHIFT;

	return remap_pfn_range(vm, vm->vm_start, pfn,
			       vm->vm_end - vm->vm_start,
			       vm->vm_page_prot) ? -EAGAIN : 0;
}
```
And it correctly identified the issue at hand, without web searches. I'd love to try something more comprehensive, e.g. shoving whole chunks of the codebase into the prompt instead of just the specific function, but it seems the latent ability to catch security exploits is there.
So then.... I wonder how this got out in the first place. I know I'm using a toy example but would love to learn more!
Yes, they certainly would. You wouldn't have smartphones, for instance.
I can't tell if this is satirical or not. But there are so many takes like this recently (hold the website liable for user content, hold the corporate developer liable for zero days in a project they happened to touch) that would all result in the same outcome (no more product at all) that I can't help but wonder if there's some luddite psy-op trying desperately to bring us back to a pre-Internet era in any way they can...
It does make me scared for what other dangers lurk since this was a really bad one and it was so little work to find.
Also of note: so many security issues lately have been found using AI. This report makes me think two things:
1. Expertise is still immensely valuable, the more niche, the more valuable.
2. There are lots of niches still where AI doesn't dominate...
If this is the case, it's good news for everyone besides NSO and Co.
Also, in contrast to iPhones, Android traditionally relies a lot more on safe languages like Java and Kotlin (and now Rust). Of course, iOS is improving there as well with Swift.
The issue is that all other Android vendors outside Google Pixel, and to some extent Samsung, are just terrible when it comes to device security.
Finally, it should be said that iOS was also compromised relatively quickly according to leaked Cellebrite presentations. The only system they could not compromise at the time was GrapheneOS, because it fully uses Pixel hardware security features and adds a lot of additional mitigations (including many that iOS doesn't use).
Also, any discussion of iOS should come with a fat disclaimer that by default iOS devices have a huge hole: most people use iCloud Backups (and are nudged towards them) without ADP, so their iCloud backups are not end-to-end encrypted and their chats, etc. can be requested by law enforcement. That you yourself use ADP does not really matter if the people you are communicating with don't. Also, Apple manages the key directory for iMessage, etc., so they could insert themselves. I would not be surprised if default non-E2E backups are a compromise that extends the NSA PRISM program Apple already participated in before the Snowden leaks.
Of course, Google isn't any better, but just to say that Apple's security/privacy story is selective. Yes, they help protect against some malicious groups and non-allied states, but they also make sure that US law enforcement (and probably some allied powers) can access most data.
In fact Apple fixed several high-criticality bugs like these not that long ago - they just don't talk about it other than "you must fix now".
Same problems, different comms. And the more people do this, the less transparent Google will be.
So something like the old iphone jailbreaking scene is just impossible now.
By definition, in Rust it's incorrect to overflow the non-overflowing integer types, so if you intend wrapping you should use the explicit wrapping operations such as wrapping_add, or the Wrapping<T> types in which the default operators do wrap. But if you turn off checks, it's still safe to be wrong - just as if you'd called the wrapping operations by hand instead of the non-wrapping operations.
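A minimal sketch of the distinction (values chosen for illustration): the explicit wrapping method and the Wrapping<T> newtype both wrap by design, while plain `+` on a bare integer is defined as non-overflowing and will panic in a checked build.

```rust
use std::num::Wrapping;

fn main() {
    let a: u8 = 200;
    // Explicit wrapping operation on a plain integer type:
    assert_eq!(a.wrapping_add(100), 44); // 300 mod 256 = 44

    // Wrapping<T>: the ordinary `+` operator wraps by definition.
    let w = Wrapping(200u8) + Wrapping(100u8);
    assert_eq!(w.0, 44);

    // With plain `+` on u8, a debug build would panic here on overflow;
    // with overflow checks off the result wraps, which is still
    // memory-safe - merely wrong if wrapping wasn't what you intended.
}
```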
That Dolby overflow code looks awkward enough that I can't imagine writing it in Rust even if checking were off - but I wasn't there. However, the reason it's on Project Zero is that it resulted in a bounds miss, which Rust would have prevented anyway.
I think Zig has the most interesting approach here, with three different "+" operators (+ aborts on overflow, +% wraps, and +| saturates) along with the @addWithOverflow builtin. It'd probably be a challenge for Rust to adopt that at this point, but it'd be a great improvement.
That is not a solution, because it means the code can behave differently, and expose a vulnerability, if the wrong compilation settings are chosen.
Functions like "wrapping_add" have such long names that nobody wants to use them, and they make the code ugly. Instead, "+" should be used for addition with exceptions, and something like "wrap+" or "<+>" or "[+]" for wrapping addition.
That's how people work: they will choose the laziest path (the simplest function name), and this is why you should use "+" for safer, non-wrapping addition and make the symbol for wrapping addition long and unattractive. Make writing unsafe code harder. This is just basic psychology.
C has the same problem: it has functions that check for overflow, but they have long, ugly names that discourage their use.
> modern hardware will just wrap if you don't check and that's cheaper
So you suggest that because x86 is a poorly designed architecture, we should adapt programming languages to its poor design? x86 will be gone sooner or later anyway.
Also, there are languages like JS, Python, Swift which chose the right path, it is only C and Rust developers who seem to be backwards.
So all operations should be function calls imo. There is not much point in having operators
If the software is correct nothing changes. The existence of people who write nonsense but expect you to work around that doesn't change between languages, they write crap in Swift or Python or Javascript just the same.
The long names are because there are, in fact, a lot of things you might want. Although Swift manages to take several pages and lots of diagrams to explain what wrapping is, that is in fact all their special operators do. What if you don't want wrapping? Too bad.
Rust provides saturating arithmetic, which is almost always what you want for signal processing (e.g. audio); separate "carry" booleans to do arithmetic the way you were probably shown in primary school; the wrapping most often provided by hardware and useful in cryptography among other places; and both explicitly cheap-but-dangerous and expensive-but-safe options. It also provides both kinds of division (and remainder), which doesn't matter for unsigned integers but is important for signed integers, and is a source of confusion and woe when languages provide only one kind or, worse, a mixture that makes no mathematical sense. These all need names.
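A small sketch of a few of those named families, plus the two kinds of signed division (values are illustrative):

```rust
fn main() {
    // Saturating arithmetic: clamps at the type's bounds, which is
    // usually what you want in audio/signal processing.
    let s: i16 = 30_000;
    assert_eq!(s.saturating_add(10_000), i16::MAX);

    // Checked arithmetic: the expensive-but-safe option;
    // overflow is reported as None instead of a wrong value.
    assert_eq!(i16::MAX.checked_add(1), None);

    // Two kinds of signed division and remainder:
    assert_eq!((-7i32) / 2, -3);            // truncates toward zero
    assert_eq!((-7i32) % 2, -1);            // remainder has dividend's sign
    assert_eq!((-7i32).div_euclid(2), -4);  // rounds toward negative infinity
    assert_eq!((-7i32).rem_euclid(2), 1);   // remainder is always non-negative
}
```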
This is a very C-flavoured "solution". For those who haven't seen it: it involves a pointer (!) - we compute the addition, write the result to the pointed-at integer, and then return true if the result didn't fit (i.e. it overflowed), false otherwise.
The closest Rust analogy would be T::carrying_add, which returns a pair to achieve a similar result.
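The pair-returning shape can be sketched in stable Rust with overflowing_add (carrying_add itself is, to my knowledge, still nightly-only), which hands back the wrapped result together with a "did it overflow?" boolean instead of writing through a pointer:

```rust
fn main() {
    // overflowing_add returns (wrapped_result, did_overflow) -
    // the same information the C out-parameter pattern conveys.
    let (sum, overflowed) = 250u8.overflowing_add(10);
    assert_eq!(sum, 4); // 260 mod 256
    assert!(overflowed);

    let (sum2, overflowed2) = 1u8.overflowing_add(2);
    assert_eq!(sum2, 3);
    assert!(!overflowed2);
}
```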
And yeah, checking is "basically free" unless it isn't, that's not different. If you haven't measured you don't know, same in every programming language.
It's never been true that you can't write correct software in C or C++; the problem is that in practice you won't do so.
(Cangjie seems like a pretty nice language in other ways as well. Similar to Kotlin with some improvements and no Java. Bootstrapping the toolchain from source seems difficult though.)
[1] https://docs.cangjie-lang.cn/en/docs/0.53.13/white_paper/sou...
[2] https://docs.cangjie-lang.cn/en/docs/0.53.13/spec/source_en/...
We've moved slightly closer to this, but in a world where we're still arguing over memory safety being necessary we've probably still got a ways to go before we notice that addition silently overflowing is a top-10 security issue. It's the silent top-10 security issue, I guess.
That said, you can enable overflow checks in Rust's release mode. It's literally two lines:
```
[profile.release]
overflow-checks = true
```
I wonder if it would make sense for ISAs to have trapping versions of add and subtract. RISC-V's justification for not doing that is that it's only a couple more instructions to check afterwards. It would be interesting to see the performance difference of `overflow-checks = true` on high-performance RISC-V chips once they are available.

Second, it's easy to say "trap on overflow", but traps are super annoying. You really want to avoid leaving user mode. As soon as you trap to the OS you're dealing with signals, which are pretty much the worst thing in the world. The four-instruction case at least lets you just branch to other code.
So you ideally want an "add or branch" instruction, but there isn't enough space in the opcodes for that. The fallback is flags, which also massively suck. I don't know if anyone has a great solution to this problem.
MIPS does (did?). And VAX, IBM/360, ....
That said, CHERI is super complicated. Checked integer arithmetic operations would be way simpler.
OpenBSD fixed this back in 2017.
https://git.kernel.org/pub/scm/linux/kernel/git/arm64/linux....
Now imagine the dark horrors hiding in the BSPs of other Android devices... or embedded devices in general.
Frankly, it should be a requirement of Google's certification process that everything regarding drivers gets upstreamed into the Linux kernel. Yes, even if this adds quite a time delay to the usual hardware development process.
Here's a cool project that inventories all your KASLR info leaks: https://github.com/bcoles/kasld
People love new technologies and features that make their lives easier, but so far only a small subset of these people have made a conscious decision to limit their exposure to risk by depriving themselves of benefits provided by some of these features.
It sure is wonderful to have your whole life digitized on a single computer. You can analyze, share, organize, gamify, record and so on every aspect of your life instantly and effortlessly. It's incredible, really. Technology is amazing. Except for the pesky bad actors who can do the digital equivalent of most physical crime from the other side of the world, anonymously, without you noticing.
It's like germs - if you don't wash your hands after touching something questionable and you don't experience any negative consequences, you'll learn not to wash them most of the time. It's just a waste of time. Maybe if you've touched something really gross you'd wash them, but that would be the exception. Security is the same. If you've been using computers the same way for years, you've learned that nothing bad happens, so why bother with hygiene, why bother making any tradeoffs?
Yes, you've heard the news of someone's nudes posted online, of someone's bank account drained or of some company's files ransomed, but you've also heard of someone dying from a brain parasite after touching a muddy puddle and rubbing their eyes afterwards. That happens rarely; we shouldn't worry about it. A car can hit you when you cross the street, lightning can strike you when you're just walking about, an aneurysm can end you at any time. No one is washing their hands all the time or constantly trying to minimize the streets they cross or anything like that. That would be foolish and impractical, and I agree.
That mindset is carried over to digital security, sadly. The risks are higher, the effort to keep good hygiene is lower, and the ability for bad actors to completely fuck you is much greater than in meat space. The rewards are seemingly greater, too, until we realize that what we get from technology is just marginally better than what we get without it. Tech is amazing, but it doesn't make us transcend time and space. It lets us organize our schedule, tag people and places in photos and summarize chats. All of that is born out of meat space. Without tech we'd still have conversation, we'd still see new places, we'd still have calendars and todo lists.

We get maybe 1% more than we would without any tech, but we let all our information and property sit unsecured for that 1% gain. That's fucked up, because the risks are big and will get bigger. And the tradeoffs we have to make to secure our digital lives may seem annoying, but are actually quite trivial: less unnecessary sharing, more isolation and compartmentalization, different computers for different tasks, less proprietary hardware and software, etc. We could get 90% of that 1% benefit from tech if we spent just a bit of time and energy on securing our digital lives. But fuck it. Let's buy the latest flagship, let's use it for ID, banking, communication, file storage, camera, health tracking, everything. Because it's a tiny bit more inconvenient to get multiple computers for different purposes, to not get the latest and newest, to not install a bunch of unnecessary shit, to be careful about the digital realm at all.
Not really on topic, but a rant. I'm tired of people (friends and friends of friends) complaining to me that they got majorly fucked one way or another and acting like the universe owes them not to get fucked while they buy a computer that exposes their asshole to the world.
TLDR: People are lazy about digital security, get badly burned, then act surprised. Don't be that person.