A sha512 hash in hex: 309ecc489c12d6eb4cc40f50c902f2b4d0ed77ee511a7c7a9bcd3ca86d4cd86f989dd35bc5ff499670da34255b45b0cfd830e81f605dcf7dc5542e93ae9cd76f
Same in this Base2048: ЗཟǷњϳݫЬߦՏԈ௰ڿƫ௪தͶޡഺཀވࡌੳٿ༲৩ত༥၄ঙџڸࠑحϷгଘƩƴߢய߅ϚƐγ๓ۑఞ
The encoding also seems to be HN-safe: no emojis or other characters HN filters. The font used here lacks some of the glyphs, but that shouldn't matter if you just copy-paste.
Base2048: ЗཟǷњϳݫЬߦՏԈ௰ڿƫ௪தͶޡഺཀވࡌੳٿ༲৩ত༥၄ঙџڸࠑحϷгଘƩƴߢய߅ϚƐγ๓ۑఞ (47 characters, 113 bytes)
Base64: MJ7MSJwS1utMxA9QyQLytNDtd+5RGnx6m808qG1M2G+YndNbxf9JlnDaNCVbRbDP2DDoH2Bdz33FVC6TrpzXbw== (88 bytes)
Base58: yP4cqy7jmaRDzC2bmcGNZkuQb3VdftMk6YH7ynQ2Qw4zktKsyA9fk52xghNQNAdkpF9iFmFkKh2bNVG4kDWhsok (87 bytes)
Original: 64 bytes, 128 bytes in hex.
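The size comparison is easy to reproduce with the stdlib (base2048 and base58 aren't in the stdlib, so those figures are noted in comments; the digest input here is just a placeholder):

```python
import base64
import hashlib

digest = hashlib.sha512(b"example").digest()   # 64 raw bytes
hex_form = digest.hex()                        # 128 hex characters
b64_form = base64.b64encode(digest).decode()   # 88 base64 characters
print(len(digest), len(hex_form), len(b64_form))
# base2048 packs 11 bits per character, so a 512-bit digest
# needs ceil(512 / 11) = 47 characters; base58 lands at ~87.
```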
I'd prefer base58. Maybe compress it first.

So, if I wanted to tweet the movie Spaceballs, encoded at 4K UHD (20 MB/s), I estimate that I would need to make 20,945,455 tweets.
Twitter limits the posts made by nobodies (like me) to 2,400 tweets per day… so it's going to take me just under 24 years to complete my task.
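Taking the tweet count above at face value, the timeline at least checks out:

```python
import math

tweets = 20_945_455   # estimate quoted upthread
per_day = 2_400       # Twitter's daily cap for ordinary accounts
days = math.ceil(tweets / per_day)
print(days, round(days / 365.25, 1))  # 8728 days, ~23.9 years
```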
I’d better get started.
You can do that with base64 + gzip + (and that’s the important one) _wrapping the content in a url_.
Here’s pong (3.5kb) stored in a single tweet: https://twitter.com/rafalpast/status/1316836397903474688?s=2...
Source: I was bored, curious if I could turn twitter into a CDN
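The packing side of that trick can be sketched like this (the pong tweet presumably ships a small self-extracting stub as well, since browsers won't gunzip a data: URL on their own; the payload here is just a stand-in):

```python
import base64
import gzip

# Stand-in payload; the real tweet packed a ~3.5 kB pong game.
html = b"<canvas></canvas><script>/* game loop here */</script>"
packed = base64.b64encode(gzip.compress(html)).decode()
url = "data:application/octet-stream;base64," + packed
print(len(html), "bytes ->", len(packed), "chars of URL payload")
```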
I don't know their use case but I was thinking more for malware command-and-control and red teaming.
At 2,400 tweets per day that's just over 9 days for base2048 or just under a week for base65536. Doable!
https://github.com/ctsrc/Base256
> Encode and decode data in base 256
> […]
> You might expect data encoded in base 256 to be more space efficient than data encoded in base 16, but with this particular set of symbols, that is not the case! Likewise, you have to type more, not less, than you would if you use my base 256 instead of base 16. So why?
> The purpose […] is to make manual input of binary data onto a computer less error-prone compared to typing in the base 16 or base 64 encoding of said data. Whereas manually typing out base 64 is painful, and base 16 makes it easy to lose track of where you are while typing, [this program] attempts to remedy both of these problems by using 256 different words from the EFF autocomplete-friendly wordlist.
Disclaimer: I am not using this base 256 program myself, even though I authored it. It just serves as a fun little experiment.
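The idea is simple enough to sketch. This uses a hypothetical stand-in wordlist rather than the 256 words from the EFF autocomplete-friendly list the README mentions:

```python
# Hypothetical wordlist standing in for the EFF
# autocomplete-friendly words used by the real tool.
WORDS = [f"word{i:03d}" for i in range(256)]
INDEX = {w: i for i, w in enumerate(WORDS)}

def encode(data: bytes) -> str:
    """One word per byte: easy to type, hard to lose your place in."""
    return " ".join(WORDS[b] for b in data)

def decode(text: str) -> bytes:
    return bytes(INDEX[w] for w in text.split())

print(encode(b"\x00\x10\xff"))  # word000 word016 word255
```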
I was experimenting with using Twitter as a CDN, here’s pong (3.5kb) in a single Tweet:
https://twitter.com/rafalpast/status/1316836397903474688?s=2...
Example: https://twitter.com/jthecoder/status/1412848719737851905
My personal Musk dream is that he'll abolish that. If I never have to see tweets of pictures of text or tweets ending in `/n` again, it will be too soon.
Though I sort of remember that Twitter was initially architected in a way that relied on short tweets, and that even the small bump they made recently was a surprisingly complex change, so who knows.
I guess bumping the limit up while keeping the ability to interact with all kinds of SMS systems and ancient phones was a challenge, or just required waiting a few years for some of the weirder systems to die out.
Interestingly, when they doubled the character limit they also started double-counting CJK characters, so the limit is still effectively 140 for those languages.
I think there are some interesting observations in how both Twitter and TikTok set out with short maximum lengths, establishing a culture of short, easy-to-digest messages, before relaxing the limit a bit. I'm not sure how much you can relax the limit before you turn the platform into something else entirely. But on the other hand, there is a pattern of users circumventing the limit anyway. It will be interesting to watch how it develops over the years.