Buncha slack jawed faggots around here
These memes will make you a sexual magasaurus. Just like DJT.....
I ain't got time to reee
Son of a bitch is dug in there deeper than a Democrat politician.
What's the matter, Eric? The CIA got you fucking too many dildos?
Are these slack-jawed yokels named Cletus?
ASCII WHAT YOU DID THERE
YOU!
What's with the bit posts? What did I miss?
I can’t keep up
It's for us old pedes from the 80s and early 90s. This was high tech printing back in the day!
In 1975 (age 11) I had a Summer job at a data processing company watching their IBM System/3 running payroll and inventory listing jobs and managing the fanfold paper off the printer while the operators went out to get drunk at bars. Also re-punch H cards in hundreds of RPGII decks to reflect a software change. Best child labor ever!
There was a card deck that printed Snoopy calendars for any year. Above the calendar part was ASCII art of the classic Peanuts scene with Schroeder at the piano, Snoopy and Linus and Lucy dancing.
EDIT: I guess you'd have to call it EBCDIC art not ASCII art! Only a few people get this joke now.
idk, but it's fucking stupid that not 1 person can explain it. Seems like some comment someone made. "you 2 bit...etc."
I think it started when someone posted that "make 8 bit great again" post. It wasn't a joke or anything; he just found cool pixel art that he wanted to share. After that, it appears someone made a gimmick out of it with this thread, by posting an ASCII image and pretending it's 7 bit. It's not; it's just one less than the last post (8 - 1 = 7), thereby gimmickifying the post. After that came the "one bite" and "two bite" posts that cleverly continued the gimmick and referenced Biden's odd habit of nibbling things in broad daylight.
It seems like thedonald.win is fairly unfamiliar with the gimmick train. Basically, it's when a non-joking forum post has its title repeated, altered, or played upon by other posts, to the point where it ends up being a train of posts all sharing a basic trait.
For example:
Post A, the OG: I LOVE APPLES and ORANGES
Post B: I LOVE FRIENDS and FAMILY
Post C: I LOVE GIMMICKS and POSTTRAINS
Or it could be something like...
Post A, the OG: don't you love Nintendo 64 and chilling out?
Post B: don't you love PlayStation 2 and chilling out?
Post C: don't you love okagamesphere and blasting into space?
Or:
Post A, the OG: Man, I can not believe it, I just told some random chick she was hot!
Post B: Ewww, some random ugly guy just came up to me and told me I was hot... disgusting
Post C: I saw the funniest shit, some ugly guy told an ugly girl she was hot and she was all pissy about it.
Look out for this being the answer when you have no idea what's going on and no one else will explain.
ASCII is defined as a 7-bit encoding. Most computers stored the 7 bits for each character in a full byte anyway, because the byte was the fundamental data type. Since that left an extra bit in every byte, operating systems like DOS added extra glyphs in the 128-255 range and called it high ASCII. That's where the extra symbols came from; most of them were for drawing better-looking text UI elements on screen. In UTF-8, the first 128 characters are the same as ASCII, but instead of using the high bit for high ASCII, a set high bit signals a multi-byte character, with the following byte(s) belonging to the same character rather than being separate ones.
https://www.asciitable.com/
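The high-bit scheme described above can be sketched in a few lines of Python. This is a simplification: a real UTF-8 validator also rejects overlong and out-of-range sequences.

```python
# Rough sketch of how the top bit separates 7-bit ASCII from UTF-8
# multi-byte sequences (simplified; real validators check more).
def classify(byte: int) -> str:
    if byte < 0x80:        # 0xxxxxxx: plain 7-bit ASCII
        return "ascii"
    if byte >= 0xC0:       # 11xxxxxx: leads a multi-byte character
        return "lead"
    return "continuation"  # 10xxxxxx: continues the current character

# Pure ASCII text never sets the high bit...
assert all(classify(b) == "ascii" for b in "MAGA".encode("utf-8"))
# ...while a non-ASCII character becomes a lead byte plus continuations.
assert [classify(b) for b in "é".encode("utf-8")] == ["lead", "continuation"]
```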
TL;DR, I see. Also unable to recognize a bit of fun. Not a good sign.
It's hard to explain a joke and not come off like a dbag with no sense of humor. I took the internet image hit for those who were confused. I was simply explaining the randomness of the joke, my bad that I didn't know ASCII = 7 bit. Last time I ever make that mistake.
2 bits used to be a reference to a US Quarter.
We are having a secret pattern recognition contest on .win tonight. 🤣
Seriously what's with the bit jokes?
Does there really need to be a reason?
And that's the best answer! Thanks OP!!!
I'm confused too
2 0 2 0 W A V E
That’s ASCII, which is technically 8-bit but it’s all good, nice render.
Fuck whoever downvoted you. 8bit ascii is in fact retarded lol
You're gonna send a full byte over the wire anyway. Might as well use the last bit.
You're not using it tho. It's bloat. A good murican system would save one byte for every 8 sent. But the byte-alignment issues are a pita to handle, so I'll give you that. Efficiency is beautiful, but real coding today relies on the fact that, by 1985 standards, we all carry supercomputers in our pockets. I'm being pedantic here, but using the ASCII 7-bit charset to make this render is true to OP's title.
You're right. You're probably using UTF-8. Which, conveniently, is a superset of 7-bit ASCII.
7-bit encoding means that none of the bytes in a stream have their most significant bit set. You're not going to send a 7-bit byte over the wire (you can't).
In fact, 8-bit bytes have been standard since the 1970s, and other byte sizes have long since gone extinct.
But it gets better. In GSM-7, you need two characters to encode certain other characters (like a curly brace or square bracket), so 7-bit encoding is actually worse for efficiency!
In other words, a hypothetical 7-bit system would often have to send more data, not less, to encode some messages.
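The GSM-7 escape cost mentioned above can be shown with a toy septet counter. The extension set here is only a subset of the GSM 03.38 table, just enough to make the point.

```python
# Toy septet counter: in GSM-7, characters from the extension table cost
# an escape septet (0x1B) plus a second septet, i.e. double the space.
# (Only a subset of the GSM 03.38 extension table is listed here.)
GSM7_EXTENSION = {"{", "}", "[", "]", "\\", "^", "|", "~"}

def septet_count(text: str) -> int:
    # 1 septet for a basic character, 2 for an extension character.
    return sum(2 if ch in GSM7_EXTENSION else 1 for ch in text)

assert septet_count("hello") == 5
assert septet_count("{ok}") == 6  # each brace costs two septets
```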
Not a byte alignment issue. 7-bit bytes don't exist.
Well, yes, because 7-bit encoding doesn't mean you're sending 7-bit bytes. It just means you're using ASCII without an extended character set. So OP is correct regardless of word size.
It's just an encoding scheme. That's all. Nothing more.
What I meant was, you could send 8-bit bytes but use a 7-bit encoding, so you'd pack 8 characters into every 7 bytes. But I really don't care; I'll send fucking UTF-16 encoded text and not give a fuck, bc we live in the future where computers can do everything fast except run poorly written JavaScript.
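That pack-8-characters-into-7-bytes idea can be sketched like so. This is a plain big-endian bitstream sketch; real-world schemes like GSM octet packing order the bits differently.

```python
def pack7(text: str) -> bytes:
    """Pack 7-bit ASCII codes into a continuous bit stream, so 8
    characters fit in 7 bytes. (A sketch, not a standard wire format.)"""
    bits, nbits = 0, 0
    out = bytearray()
    for code in text.encode("ascii"):
        bits = (bits << 7) | code   # append 7 bits per character
        nbits += 7
        while nbits >= 8:           # flush complete bytes
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)
    if nbits:                       # zero-pad the final partial byte
        out.append((bits << (8 - nbits)) & 0xFF)
    return bytes(out)

assert len(pack7("8 chars!")) == 7  # 8 * 7 bits = 56 bits = 7 bytes
assert len(pack7("x" * 80)) == 70   # saves one byte in every eight
```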
That bit was checksum in CDC world
Unless OP made that 2016-looking render of DJT on a computer that’s older than 25 years, that sucker’s eight-bit. You’d have to jump through some hoops to shave that bit. Maybe there’s a site or some software that will crank out a 7-bit version and save that essential pre 1985 space.
It's ASCII (7 bit) data. It's still 7 bit encoded ASCII no matter what wrapper you deliver it in. The same would hold if it were 5 bit Baudot encoded data.
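Whether a blob really is 7-bit clean is a one-line check (a hypothetical helper, just to make the claim concrete):

```python
def is_seven_bit_clean(data: bytes) -> bool:
    # Pure ASCII: no byte has its most significant bit set.
    return all(b < 0x80 for b in data)

assert is_seven_bit_clean(b"ASCII art is 7-bit clean")
assert not is_seven_bit_clean("café".encode("utf-8"))  # UTF-8 sets the high bit
```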
Doesn't email default to 7bit ascii?
Modern clients usually default to UTF-8. 8-bit ASCII if it's multipart and includes a text/plain part, but the HTML-formatted message will usually be UTF-8 encoded as quoted printable or base64.
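Python's stdlib email package shows the multipart layout being described. Which transfer encoding a real client actually picks (7bit, 8bit, quoted-printable, base64) varies, so this only demonstrates the structure.

```python
from email.message import EmailMessage

# Sketch of a multipart/alternative message: a plain-text part plus an
# HTML part containing non-ASCII text.
msg = EmailMessage()
msg["Subject"] = "7-bit art"
msg.set_content("plain ASCII body")                      # text/plain part
msg.add_alternative("<b>caf\u00e9</b>", subtype="html")  # non-ASCII HTML part

types = [p.get_content_type() for p in msg.walk()]
assert types == ["multipart/alternative", "text/plain", "text/html"]
```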
We have the best computer programmers, don't we? Thanks for the info; glad I don't have to deal with email. Wish I could never look at CSS again lol.
No chars above #7F used in the pic, so 7 bits is enuf.
EDIT: It would have been more confusing to do it in 5 bit radioteletype 👍
Not so fast...ASCII has 2 modes. Normal and special character. Normal is 7 bit.
Hahaha I was gonna say
Even RGB's modem can handle this
But her mind can't handle this. Where's she now. :/
She needs more RAM
Did Barron make this? I heard he's good with the cyber.
THE EXPERT
Word. None of this globalist Unicode bullshit. 🤣 <---
OP you're my hero
No way, it's great job security. Just ask anyone having to convert a Python 2.x codebase to Python 3.x.
Or when a big site like GitHub transliterates [email protected]ıthub.com to [email protected] but still sends the password reset email to gıthub.com, it makes for some fun.
Granted, BTDT. But IMO it's a canard. I often wish the sweatshop coders (.in, for example) working for the body shops working for the 3- and 4-letter companies would spend as much time on error handling & recovery as they spend on transliteration.
Probably because the transliteration is often handled in most libraries, sometimes transparently, and sometimes unexpectedly when handling different encodings[1].
Error handling and recovery from said errors requires actually thinking about the problem, writing unit tests, and real work. The overseas Java/whatever shops are looking mostly at volume not quality, so we shouldn't be hugely surprised that they're not going to focus on real work.
[1] Granted, "transparent" doesn't always mean "correct." I've used unidecode in Python for this purpose, which I think handles things more sanely, generally speaking.
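The dotless-ı trick above is easy to demonstrate with the stdlib. (unidecode, mentioned in the footnote, is a third-party package that does fold it to "i"; plain Unicode normalization does not.)

```python
import unicodedata

# Turkish dotless ı (U+0131) looks like "i" but is a distinct code point,
# so "gıthub.com" is a different domain from "github.com".
assert "gıthub.com" != "github.com"
assert unicodedata.name("ı") == "LATIN SMALL LETTER DOTLESS I"

# NFKD normalization does NOT fold it to "i", so naive normalization
# passes the spoofed domain through unchanged.
assert unicodedata.normalize("NFKD", "gıthub.com") == "gıthub.com"
```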
Agreed. The real cost, though, shows up when you're trying to keep a site like Citi or AmEx up when it's full of this kind of code. If they've learned their lesson, they write some heavy language into the testing and LTS sections of their contracts now.
Ironically, American companies like Computer Associates (now CA Technologies, a subsidiary of Broadcom) were the pioneers of this business technique.
I'm from the 80's, and when this ASCII art came out it was the greatest thing ever! So glad it's still going. Looks great!
This retro phase brings warmth 2 many. I reflect on origins in my decades ago main frame days.
I love this! All of these old retro 80s early 90s posts tonight are so much fun! One of my favorite nights on TD!!! Keep it up!!!
Love it! God bless!
Quality art. That took a lot of coffee to make
Not really, you just pass an image through a filter. It’s like three clicks of effort.
Still cool though.
Wow this brings me back to pics they would print out like this in the 70s on the boardwalk!
This site is by far the best thing to happen in 2020... so far.
Trying to get a friend on here tonight, I repeatedly said td.win is the happiest place on the internet.
Bigly
OrangeBlack Man Good!
That's epic
LOVE how the jacket is mostly made of "Q"s
If only I still had access to a line printer.
This really makes me miss having dot matrix printers. We should make those great again as well.
Is this like the 7 minute abs competition from There's Something About Mary?
Done; it's named DJT2020.
EDIT: extra updoot for username 👌
Link is https://pastebin.com/BvNcYPYh so you don't have to log in.
We’ve got the biggest endians, folks!
When I started with computers, ASCII art was the only way to produce an image. Seriously.
Y'all are making me feel old! But I love it!!!
Q confirmed. I knew it!
Not just a coincidence that the base of his head is " Q "
Do I see a BQQQQM in the right eye you sly dog
Seems like a waste that "bill clinton is a rapist. Www.infowars.com" wasn't hidden in there somewhere. Or it was a waste for me to spend so long looking for it.
Epic.
Line printer art. Loved that stuff. We had a pair of CDC-580 series line printers. One printer was uppercase only and one mixed-case. At the peak of our paper usage we did just short of a million lines on the mixed-case printer and 1.6 million on the uppercase-only printer (more repeat characters in the print train). This was in a 24-hour period. Those things were brutes.
(This was all back in mainframe days before personal computers were at all common.)
We used to crank out this kind of stuff on green-and-white tractor-feed paper forms using the Burroughs B6800 mainframe that ran our university network. Also had a great multi-user Star Trek game we got to play when no one was on the network.
We were genuinely excited when we got the B6800 upgraded to 512KB of RAM.
The further away you look at it, the better.
BASEd64
Oh FUCK yes...