I'm sure the authors of malicious software will be certain to respect this clause
I mean, why even put that there?
IMHO these days it's far more about whether your binary is "known" than what it actually does. AV is a protection racket. I realised that long ago when "hello world" executables compiled with MSVC passed, but the same source code run through GCC was deemed malicious and quickly deleted.
> Download gopacker from https://gopcker.dev
3. Packing the programs with gopacker
> After processing by our tool, 6 engines reported the program as malicious when it was not signed with a *digital* certificate.
(And features of Go assume the binary is not stripped, too. The symbol table is useful!)
> And this program has used -ldflags "-s -w" to remove the debug information and symbol information when compiling, which shows that if no processing is done for the program written in golang, there will be no secrets at all.
Though perhaps those flags were just not effective. Maybe Go places its own flags after the user-specified ones, or maybe they're only honored in cgo mode?
Edit: https://words.filippo.io/shrink-your-go-binaries-with-this-o...
It retains the symbols needed to format stack traces, so most symbols remain. So go link's "-s" is rather different from how normal linkers interpret "-s" (omit all symbol and debug information, functionally equivalent to strip(1)).
So, anyway, if you want to get a sense of whether, in general, your binary is likely to trip “AV,” then by all means submit it to VT. If, however, you want to know whether a specific AV detects it, submitting it (or worse, some other binary processed with the same packer) to VT is dumb.
0 - https://docs.virustotal.com/docs/antivirus-differs
1 - interestingly, some researchers found lower detection rates on VT, which they attributed to the vendors not using their cloud-based analysis modules. https://www.researchgate.net/publication/364073462_A_Compara...
If you chose to write your program in Go, it is likely not worth copying. ;-)
libsodium has functions for all of that, and Rust has the "secrets" crate, which wraps them.
I don't know much Go, but a quick search suggests it has libraries that take care of these things as well - unsurprisingly.
And almost everything nowadays copies values freely, because almost everything passes function parameters by copying. Whether any given piece of code keeps all values only in the "correct" place is difficult, twitchy, and at the mercy of local compiler/interpreter optimizations.
I wouldn't trust Go qua Go to keep my secrets only where I "put" them, and I would hardly trust any other modern language either. They're all based on copying arguments around. Rust is closest, and I'm not sure I'd trust even that without a lot of checking, because while the language layer may have best-in-class controls over sharing, that's not to say the optimizer won't create copies of things under the hood. Those sharing controls are, as I understand it, language-level promises, not hardware-level promises about how memory will be treated. You almost have to reduce to a minimal kernel and program it in assembler if you want to be sure the secrets can't flow, and that may be easier said than done depending on how complicated that kernel is. (E.g., simply reading something from a file and keeping it confined is easy, but if I have to do crypto with a secret, that's a very large chunk of code to worry about.)
It isn't even just the software; the hardware stack is simply not designed for the CPU to avoid making copies. The hardware is designed to present an isolated view of the world to the software running on it, but there are numerous large abstractions between the view of the world the software normally sees and the actual state of the hardware. What you will see when those abstractions are penetrated (e.g., a straight-up RAM dump or disk dump) can be difficult to predict. A RAM page swapped to an SSD can physically reside on that SSD indefinitely, because the SSD is remapping sectors continuously; you could get unlucky, and if the controller marks that physical sector bad, your page is the last thing ever written to it. And that's just an example, not the complete list; I wouldn't count on madvise to protect me from all copies without a lot more research. Everything from the highest software layer to the lowest hardware layer is fighting you if you try to exercise this much control over where your data goes.
Every language has its memory exposed; GC has nothing to do with it.
The problem with a copying GC is that, even if you clear the memory which had for instance a secret key, it might have been copied from elsewhere as part of a heap compaction, and the old copies of that memory will not be cleared (until accidentally overwritten when the GC reuses that memory for something else). With manual memory management (or a non-copying GC), you can always manually erase the memory before releasing it (but even then, you have to make sure the compiler doesn't optimize out your clearing; for instance, you should use explicit_bzero() in C).
Rust has a variant of this issue: even though it doesn't have a GC, values tend to be moved in memory when their ownership is transferred. This can be avoided by keeping them on the heap (within a Box), since it's the pointer to the heap (the Box struct) that tends to be moved; but you have to take care never to move the value out of the Box (or similar, like Rc or Arc) before clearing it. The Pin struct might help by making it harder to move the value out of the Box (as long as you understand how to use it correctly; I've always found Pin somewhat confusing).