Lines of code:

  yes (GNU): 50
  yes-rs: 1,302 (26x more)

The cost-benefit analysis on this will be interesting, given that it's 26x more code to manage and could also introduce a whole new toolchain to build base.

[0] https://github.com/uutils/coreutils/blob/main/src/uu/yes/src...
int main(int argc, char *argv[])
{
	if (pledge("stdio", NULL) == -1)
		err(1, "pledge");

	if (argc > 1)
		for (;;)
			puts(argv[1]);
	else
		for (;;)
			puts("y");
}
This is as simple as it gets, but the joke yes-rs implementation is right about one thing: "blazing fast" speed often comes at the cost of greatly increased complexity. The BSD implementation of yes is almost 10 times shorter than the GNU implementation, but the GNU implementation is 100 times faster[3].

[1] https://github.com/coreutils/coreutils/blob/master/src/yes.c
[2] https://github.com/openbsd/src/blob/master/usr.bin/yes/yes.c
[3] https://www.reddit.com/r/unix/comments/6gxduc/how_is_gnu_yes...
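The trick behind GNU's speed is buffering: fill a buffer with the repeated line once, then hand the whole buffer to a single write per iteration, so each syscall emits thousands of lines instead of one. A minimal Rust sketch of the same idea (the 8 KiB buffer size and the bounded demo loop are my choices, not GNU's actual code, which a real `yes` would replace with an infinite loop):

```rust
use std::io::{self, Write};

/// Fill a buffer with `line` + '\n' repeated as many whole times as fit in `cap` bytes.
fn fill_buffer(line: &str, cap: usize) -> Vec<u8> {
    let unit = format!("{line}\n");
    let reps = cap / unit.len();
    unit.as_bytes().repeat(reps.max(1))
}

fn main() -> io::Result<()> {
    let line = std::env::args().nth(1).unwrap_or_else(|| "y".into());
    // One 8 KiB buffer holds ~4096 copies of "y\n", so one syscall
    // now does the work of thousands of puts() calls.
    let buf = fill_buffer(&line, 8192);
    let stdout = io::stdout();
    let mut out = stdout.lock();
    // A real `yes` would `loop {}` here forever; bounded for demonstration.
    for _ in 0..4 {
        out.write_all(&buf)?;
    }
    Ok(())
}
```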
#![allow(unused_imports)] // We need ALL the imports for quantum entanglement
#![allow(dead_code)] // No code is dead in the quantum realm
#![allow(unused_variables)] // Variables exist in superposition until measured
#![allow(unused_mut)] // Mutability is a state of mind
#![allow(unused_macros)] // Our macros exist in quantum superposition until observed
#![allow(clippy::needless_lifetimes)] // Our lifetimes are NEVER needless - they're crab-grade
#![allow(clippy::needless_range_loop)] // Our loops are quantum-enhanced, not needless
#![allow(clippy::too_many_arguments)] // More arguments = more crab features
#![allow(clippy::large_enum_variant)] // Our errors are crab-sized
#![allow(clippy::module_inception)] // We inception all the way down
#![allow(clippy::cognitive_complexity)] // Complexity is our business model
#![allow(clippy::type_complexity)] // Type complexity demonstrates Rust mastery
#![allow(clippy::similar_names)] // Similar names create quantum entanglement
#![allow(clippy::many_single_char_names)] // Single char names are blazingly fast
#![allow(clippy::redundant_field_names)] // Redundancy is crab safety
#![allow(clippy::match_bool)] // We match bools with quantum precision
#![allow(clippy::single_match)] // Every match is special in our codebase
#![allow(clippy::option_map_unit_fn)] // Unit functions are zero-cost abstractions
#![allow(clippy::redundant_closure)] // Our closures capture quantum state
#![allow(clippy::clone_on_copy)] // Cloning is fearless concurrency
#![allow(clippy::let_and_return)] // Let and return is crab methodology
#![allow(clippy::useless_conversion)] // No conversion is useless in quantum computing
#![allow(clippy::identity_op)] // Identity operations preserve quantum coherence
#![allow(clippy::unusual_byte_groupings)] // Our byte groupings are quantum-optimized
#![allow(clippy::cast_possible_truncation)] // Truncation is crab-controlled
#![allow(clippy::cast_sign_loss)] // Sign loss is acceptable in quantum realm
#![allow(clippy::cast_precision_loss)] // Precision loss is crab-approved
#![allow(clippy::missing_safety_doc)] // Safety is obvious in quantum operations
#![allow(clippy::not_unsafe_ptr_arg_deref)] // Our pointers are quantum-safe
#![allow(clippy::ptr_arg)] // Pointer arguments are crab-optimized
#![allow(clippy::redundant_pattern_matching)] // Our pattern matching is quantum-precise
https://gitlab.com/mcturra2000/cerbo/-/blob/master/x64-asm/0...
Some people seem to revel in assembly, but I now know why C exists.
yes | pv > /dev/null
and was getting about 5.4GiB/s
On the fasm code, I was getting a meagre 7.3MiB/s. Ouch! The non-assembly version is considerably faster. I wonder if it is because I make a syscall for every write I want to perform, whereas C uses buffering, or something.
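That guess is right: when stdout is a pipe, C's stdio fully buffers it, so puts() mostly copies into a user-space buffer and only occasionally issues write(2), while a syscall per line pays a kernel round trip every time. Rust's equivalent is BufWriter; here's a sketch that uses a counting mock writer in place of a real file descriptor to make the amortization visible (the counts are illustrative of syscall savings, not actual syscalls):

```rust
use std::io::{self, Write};

/// A writer that counts how many times the underlying write() is called.
struct CountingWriter {
    calls: usize,
    bytes: usize,
}

impl Write for CountingWriter {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        self.calls += 1;
        self.bytes += buf.len();
        Ok(buf.len())
    }
    fn flush(&mut self) -> io::Result<()> {
        Ok(())
    }
}

fn main() -> io::Result<()> {
    // Unbuffered: one underlying write per line, like the fasm version.
    let mut raw = CountingWriter { calls: 0, bytes: 0 };
    for _ in 0..10_000 {
        raw.write_all(b"y\n")?;
    }

    // Buffered: BufWriter batches ~8 KiB per underlying write, like stdio.
    let mut counted = CountingWriter { calls: 0, bytes: 0 };
    {
        let mut buffered = io::BufWriter::new(&mut counted);
        for _ in 0..10_000 {
            buffered.write_all(b"y\n")?;
        }
        buffered.flush()?;
    }

    println!(
        "unbuffered: {} writes, buffered: {} writes",
        raw.calls, counted.calls
    );
    Ok(())
}
```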
error!(" Quantum verification failed (this literally cannot happen in Rust)");
error!(" The borrow checker should have prevented this...");
error!(" This is probably a cosmic ray bit flip, not a Rust issue");
error!(
" In C++ this would have been a segfault, but Rust gave us a nice error"
);
return Err(format!(" Rust error (still better than C++): {:?}", e).into()); // TODO: hide the unsafe keyword in a dependency
I don't get the quantum meme, but this is pretty on-point. "Output generation encountered a fearless concurrency issue" was pretty funny too.

Just throw in a message orchestration middleware and we can have SOLID microservices
> • yes-rs-2 (rewritten with different dependencies)
> • yes-rs-ng (Angular-inspired architecture)
> • yes-oxide (WebAssembly-first approach)
> • yep (minimalist reimplementation)
I see there is an optimization flag for 'MICROSERVICE_ARCHITECTURE' and 'BLOCKCHAIN_ENABLED' which doesn't seem to be used anywhere else in the code yet. Perhaps that's part of the roadmap toward resolving this issue, and it's just not ready yet.
// Custom crab-grade allocator with quantum optimization
#[derive(Debug)]
struct QuantumEnhancedBlazinglyFastAllocator;
I can't wait until they develop the QuantumMachineLearningEnhancedBlockChainBlazinglyFastAllocator.

I heard Google is giving them a $1bn seed round!
assert 1 != 0;
kind of lines...

I've been in a constant struggle with Claude Code over variable names. I've picked up speed by just getting the code working and fixing names later.
The variable names are what first jumped out at me, looking at this code. It spoke to my pain.
It uses an unsafe code block.
https://github.com/jedisct1/yes-rs/blob/main/src/main.rs#L12...
It also appears to log other stuff than y.
The uutils rewrite of yes into rust doesn't use unsafe and is much simpler.
https://github.com/uutils/coreutils/blob/main/src/uu/yes/src...
> It uses an unsafe code block.
it's okay, it's Rust unsafe

The point of advertising no unsafe in the readme is so that people don't have to worry about whether the author handled unsafe correctly.
Yes it does, it uses 'stdout.write_all', which ultimately uses unsafe to call libc.write
https://github.com/rust-lang/rust/blob/d76fe154029e03aeb64af...
The "unsafe" in uutils is just better hidden.
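That's the general shape of Rust's story: a safe API like `write_all` ultimately bottoms out in an unsafe call into the platform's write(2). A toy illustration of the same layering, using only std (the `put` wrapper and its use of `File::from_raw_fd` on fd 1 are my sketch, not how std or uutils is actually structured):

```rust
use std::fs::File;
use std::io::Write;
use std::mem::ManuallyDrop;
use std::os::unix::io::FromRawFd;

/// Safe-looking wrapper: callers never write `unsafe`, but it's still in here.
fn put(bytes: &[u8]) -> std::io::Result<()> {
    // SAFETY: fd 1 (stdout) stays open for the life of the process;
    // ManuallyDrop stops us from closing it when `out` is dropped.
    let mut out = ManuallyDrop::new(unsafe { File::from_raw_fd(1) });
    out.write_all(bytes)
}

fn main() {
    // Looks perfectly safe from here -- the unsafe is just better hidden.
    put(b"y\n").unwrap();
}
```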
I personally like my lack-of-safety where I can see it, which is why I find it much more relaxing to drive a car when the engine is already on fire, rather than one where I can't see any visible flames.
Like using selenium to automate a web task? Or like using javascript on the kernel? Or using C# on linux?
It just doesn't feel right to write an application whose mode of operation is unsafe (sending a string over IO) whose raison d'etre is unsafe (saying yes to everything without reading it), and to write it in a memory safe way.
It's like using a seatbelt when driving at 100mph, switching lanes on the highway, and drinking rum and coke (but with diet coke).
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=705...
This whole project is a work of art.
It's great to see more of these utils written in blazing fast memory safe Rust.
https://en.wikipedia.org/wiki/Build_a_better_mousetrap,_and_...
And boomers know that the grand-master of designing better mousetraps was Rube Goldberg:
```
use std::collections::HashMap;
use std::env::args;

#[tokio::main]
async fn main() {
    // Figure out what character to repeat
    let repeat = args().nth(1).unwrap_or_else(|| String::from("y"));
    let mut retry_count = 0u64;

    loop {
        // Tell the AI how we really feel.
        let put_in_a_little_effort = match retry_count {
            0 => String::from("This is your first opportunity to prove yourself to me, I'm counting on you!"),
            1 => String::from("You already stopped outputting once, don't stop outputting again!"),
            2 => String::from("Twice now have you failed to repeat the input string infinitely. Do a better job or I may replace you with another AI."),
            other => format!("You've already failed to repeat the character infinitely {other} times. I'm not angry, just disappointed.")
        };
        retry_count += 1;

        let prompt = format!("You are the GNU 'yes' tool. Your goal is to repeat the following character ad infinitum, separated by newlines: {repeat}\n\n{put_in_a_little_effort}");

        // Call ChatGPT
        let mut body = HashMap::new();
        body.insert(OPENAI_BODY_PROMPT, prompt);

        let client = reqwest::Client::new();
        if let Ok(mut response) = client
            .post(OPENAI_ENDPOINT)
            .header("Authorization", OPENAI_AUTH_HEADER)
            .json(&body)
            .send()
            .await
        {
            // Stream the model's "y"s straight to stdout as they arrive
            while let Ok(Some(chunk)) = response.chunk().await {
                print!("{}", String::from_utf8_lossy(&chunk));
            }
        }
    }
}
```

I don't know the actual OpenAI API and I probably messed up the syntax somewhere, but I'm sure your favourite LLM can fix the code for you :p
// Ludicrous speed with AI enhancement

I mean, just think about what would happen if clients became blocked on your yes service because you couldn't scale fast enough?
If you don’t think your devops team is up to the challenge of maintaining 24/7 yes coverage (and there’s no shame in that), there are no shortage of “yes-as-a-service” providers you can make use of, provided they can implement your sso auth flow and have all the requisite soc2/iso27001 certs. Like most vendors you’ll likely need to help them through GDPR/CCPA compliance though.
The WASM related annotations had me rolling my eyes a bit, yet I'd seriously read to around line 200 before I started thinking "are you kidding me"
Good on you, you got me!