Notices by Trade Minister Tagomi (trademinister@freespeechextremist.com), page 3
-
@p
They wouldn't play. Tried every browser.
-
@p
> Ogg files, sometimes they don't work properly on phones. Let me try mp3:
bb00ff.mp3
bb0109.mp3
Maybe it's just my phone. It has a music player, but it's insisting on bringing everything up as an FSE website, with an embedded player, which doesn't work.
-
> I hate phones so much.
Me too, and I almost never use anything else.
Because... Soooo... Convenient.....
-
@p
@TradeMinister
> Even 64-bit addresses would have allowed for a publicly routeable address for more than the number of chips that have been manufactured or that *can* be manufactured once we've scraped every grain of copper from the earth: it allows for 2.4 b-b-b-billion addresses per living human. 64-bit addresses are excessive, but I could understand it. 128-bit addresses (4.2e28 addresses per human, so each human could assign 20,000 addresses to every star in the known universe) go way past the laughable "and we won't have to ever fix anything again because this time the protocol is perfect!" approach and into the realm of parody.
Yeah, 64 bits sounds fine, and might even be human-readable.
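For what it's worth, the arithmetic checks out. A quick back-of-the-envelope in C (the 8-billion population figure is my assumption; at ~7.7 billion you land on the 2.4 billion per head):

    /* Back-of-the-envelope: addresses per living human for 64-bit
     * vs 128-bit address spaces. Population of 8e9 is an assumption. */
    #include <stdio.h>

    int main(void)
    {
        double people = 8e9;
        double v64    = 0x1p64;    /* 2^64  ~ 1.8e19 addresses */
        double v128   = 0x1p128;   /* 2^128 ~ 3.4e38 addresses */

        printf("64-bit:  %.2g addresses per human\n", v64 / people);   /* ~2.3e9  */
        printf("128-bit: %.2g addresses per human\n", v128 / people);  /* ~4.3e28 */
        return 0;
    }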
> Not just the C++ mindset, but now also C++ programmers: a couple of years ago they snuck C++ into gcc's source code, so the C compiler is no longer written in C.
Hard to imagine rms allowing this. I guess he must have handed gcc to a subsequent generation.
> > The real thrill was microcode
> I should correct it to "Chads *implement* Ring 0 for kernel programmers and other mortals."
Machines, at least the minis I knew about and the one I worked on, were simpler then. In one instruction, you might simultaneously tell the ALU to add, start a memory read or write, and tell the shift-rotate register to do something, but there was none of this speculative-execution stuff, which has turned out to be problematic.
Writing instruction emulation ucode for modern chips must be really hard compared to what we did back then.
-
@p
@TradeMinister
> > Yeah, 64 bits sounds fine, and might even be human-readable.
> Not super readable, but readable enough, and fits in a register, and more addresses than Earth will need.
Butbutbut space! And galaxies and shit! Ancient aliens!
> > Hard to imagine rms allowing this. I guess he must have handed gcc to a subsequent generation.
> I'm not privy.
Me either. But I did sort of know him a little bit (you probably would have too, if you were in my location). And I got the sense that he was a pretty old-skool guy in terms of K&R V7 etc design philosophy: human-readable, minimalist. While he did invent Emacs Lisp and lived in the MIT AI lab, I can't imagine him inventing the C++ abortion. I don't know if he was involved in ObjC, but I think it would be more to his liking, and Apple (much as I love to hate them) made a top-notch choice in adopting it instead of C++.
I'd be shocked and saddened to hear he blessed moving gcc to C++, or allowing 'optimizations' like transforming printf() to puts().
> Yeah; I think overall, load/store was a good thing.
On the BBN C machine, it was interesting: start load/store, do other things for 3 cycles, data now available.
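Roughly this shape, sketched in C just to show the idea (the three-cycle figure is from memory, and on a real machine the microcode or compiler did the scheduling, not the programmer):

    /* Illustrative only: hide a (hypothetical) 3-cycle load latency by
     * doing independent work between issuing the load and using the data. */
    int hide_load_latency(const int *p, int a, int b, int c)
    {
        int x  = *p;       /* cycle 0: load issued                */
        int t1 = a + b;    /* cycle 1: independent ALU work       */
        int t2 = t1 * c;   /* cycle 2: more independent work      */
        int t3 = t2 - a;   /* cycle 3: still not touching x       */
        return t3 + x;     /* data from the load is available now */
    }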
> I think, for the complication, a better memory bus would have been a wiser investment than the mess of a pipeline we have now.
Exactly. Busses are key. I believe one of the central parts of the Xen design is something they call a 'fabric' or similar: anyway, a chip-internal, very intelligent bus or network of busses.
It's not for nothing they used to call it data processing: unless one is doing AI or heavy number-crunching, one is usually plowing thru lots of data. The superfast whizzo chips with their pipelines and hyperthreading and such probably still spend lots of their time idling, waiting for all... those... nanoseconds to pass before a memory access completes. And that's not even thinking about contention between cores, locking, etc.
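To put a rough number on those nanoseconds (the clock rate and DRAM latency below are ballpark assumptions, nothing measured):

    /* Ballpark: how many core cycles fit inside one trip to main memory. */
    #include <stdio.h>

    int main(void)
    {
        double clock_hz = 4.0e9;  /* assumed core clock            */
        double dram_ns  = 80.0;   /* assumed miss-to-DRAM latency  */

        printf("One cache miss ~ %.0f core cycles of waiting\n",
               dram_ns * 1e-9 * clock_hz);   /* ~320 */
        return 0;
    }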
I used to overclock a bit, and found that boosting one's CPU frequency a lot mattered much less than boosting the bus speed a little, except for CPU benchmarks.
-
@p
Speaking of which, I was well into a long response, got to your 'vampire' and decided to see what 'colon v' (I dare not type it) would get me, and Husky just trashed the response. I'll try again later.
-
Hey @p , last I knew, you were working on something big, code-wise, that you were getting close to releasing. I wasn't clear what it was, but it sounded interesting.
-
@p
I quite enjoyed the original TCP/IP RFCs. Back in the day, if you were on ARPAnet, DARPA would just send them to you. I may still have copies somewhere. Anyways, I considered them excellent examples of how to design and document a protocol (and X.25 a perfect example of how not to).
Revolver is first and foremost a protocol; the implementations are secondary. In practice, the two will evolve together. I'd enjoy seeing your protocol docs when you get around to writing them.
-
@p X.25 was a great fantasy protocol, clearly written by Eurocrat philosophers. They actually tasked me, by myself, with implementing it! I hadn't thought about it, but recently realized that this was nuts: I didn't even do network code; Steve Dyer handled the TCP/IP stack and I did the rest. (I just read the RFCs for pleasure: they were like a well-designed machine architecture, which I also read for aesthetic enjoyment.)
Fortunately I had already decided to leave, and just stalled them while I worked on leaving the kernel and ucode in the best state I could.
Strictly blue sky: a good and maybe easy way to test both code and protocol might be to have VMs jabbering at each other. Eventually you'd have to simulate bad actors trying to meddle, but that would be down the road.
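Something with this shape, except with the endpoints in separate VMs and speaking the real protocol. A minimal UDP-over-loopback sketch, just to show what a jabbering harness looks like (port, payload, and the single-process setup are all arbitrary; error checking omitted):

    /* Two UDP endpoints "jabbering" over loopback in one process. */
    #include <netinet/in.h>
    #include <stdio.h>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void)
    {
        int a = socket(AF_INET, SOCK_DGRAM, 0);   /* talker   */
        int b = socket(AF_INET, SOCK_DGRAM, 0);   /* listener */

        struct sockaddr_in addr_b = { 0 };
        addr_b.sin_family      = AF_INET;
        addr_b.sin_port        = htons(9999);                 /* arbitrary test port */
        addr_b.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
        bind(b, (struct sockaddr *)&addr_b, sizeof addr_b);

        const char msg[] = "HELLO protocol-under-test";
        sendto(a, msg, sizeof msg, 0, (struct sockaddr *)&addr_b, sizeof addr_b);

        char buf[128];
        ssize_t n = recvfrom(b, buf, sizeof buf, 0, NULL, NULL);
        if (n > 0)
            printf("endpoint B got: %s\n", buf);

        close(a);
        close(b);
        return 0;
    }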
TCP/IP was written to survive a nuclear war; it was a pleasing model of pragmatism and efficiency.
It sounds like what you are working on is to ruggedize the Fedi protocol against cyber warfare; the RFC will be interesting!
-
@p
I haven't read IPv6, but 30 years of daydreaming doesn't sound good. The people who wrote IPv4 and TCP were very good, but pragmatic, practical, "what works and will keep working" engineers, I think. Some were probably at BBN, but 5 or 10 years before me.
MIT guys, I would guess. That's how it read to me, anyway.
BBN supported some dreamers, like a guy who, AFAIK, worked on nothing but a C interpreter the whole time I was there and never got it working, but he was probably up in the Tower, where they did actual research. He was probably a Harvard guy. My wing was just engineers, more MIT types.
-
@p
@TradeMinister
> Your critiques of IPv6, should you get around to reading it, are probably better informed than mine are.
Probably not; I was never a TCP/IP stack guy.
> There's some nice stuff in there, but there's also some stuff that I wish they hadn't, some stuff that they speculated they'd want in the 1990s and then expanded on over the years without any negative feedback. (The easy target is the address notation.)
Sounds like C++; maybe that kind of bloat was in the air then.
> People have this vague notion that a C interpreter ends up being useful for a lot of things. I think it seems like a very large amount of trouble and I can't identify the payoff. Apparently CERN had one and there was one knocking around Bell Labs, but I can't find it.
If one wants a C interpreter, play with JavaScript instead. Otherwise, a VM and a debugger. I can't say I ever really saw the point, but I was almost never in userspace anyway.
The guy doing it was a weird, hairy little fellow, maybe Dan something. Maybe he went off to CERN or Bell Labs and got it working.
> > My wing was just engineers
> Idyllic.
It really was. Best job I ever had, except for being a river guide, but it paid a lot better. Doing that 9 months of the year, then guiding in Summer: that would be hard to beat. Unless there's a surf-guiding gig. That might be better.
-
@p
@TradeMinister
>> Sounds like C++; maybe that kind of bloat was in the air then.
> One of the objections to RFC 1597 that were laid out in RFC 1627, "Network 10 Considered Harmful", was that private networks would make it take longer to exhaust the IPv4 address space, which was about to be exhausted any minute now (where "now" meant "1994"), so the sky has been falling since IPv6 was called "IPng", and the IPv6 guys were trying to make the sky fall faster so that the perfect system could replace the good one.
Nice. Perfect... Good... Sounds familiar somehow...;)
Well, one nice thing about IPv4 is that a human can remember and write 127.0.0.1, whereas IPv6 addresses look machine-readable only. I suppose it is necessary for one's refrigerator and microwave and desk clock to have their own IP addresses... Or maybe none of that shit belongs on the Net, and fuck the IoT.
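Side by side, for anyone who hasn't stared at one lately (the long address is a made-up example under the 2001:db8 documentation prefix):

    /* IPv4 vs IPv6 notation side by side. The 2001:db8 address is a
     * made-up example under the documentation prefix. */
    #include <stdio.h>

    int main(void)
    {
        const char *v4_loopback   = "127.0.0.1";
        const char *v6_loopback   = "::1";                        /* compressed form */
        const char *v6_full       = "2001:0db8:85a3:0000:0000:8a2e:0370:7334";
        const char *v6_compressed = "2001:db8:85a3::8a2e:370:7334"; /* same address  */

        printf("%s  %s\n%s\n%s\n", v4_loopback, v6_loopback, v6_full, v6_compressed);
        return 0;
    }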
>> Otherwise, a VM and a debugger.
> Yeah, that was my suspicion, more or less: it's easier to simulate a CPU than to interpret C, and without a lot of benefit: you've still got to wander through all of the files, parse them, etc. That was the slowest part of the compiler until the gcc team ran out of stuff to do and decided they could "optimize" by making 30 passes that squeeze out almost nothing and do dumb shit like rewriting printf to puts (seriously, they do this, and they mangle the string constant in the binary and this turns one syscall into two, which is more pessimization than optimization).
I heard about that bs 'optimization'. Fscking C++ mindset.
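For anyone who hasn't seen it, it's easy to reproduce (the file name is arbitrary):

    /* hello.c -- with optimization on, gcc rewrites this printf call as
     * puts("hello, world") and drops the '\n' from the string constant,
     * since puts appends the newline itself.
     * Check with:  gcc -O2 -S hello.c   and look for the call to puts. */
    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");   /* becomes puts("hello, world") */
        return 0;
    }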
I wonder if anyone ever made a register-transfer-language emulator. gcc first compiles to RTL, which looks a lot like an assembly language, then does its optimizations on the RTL. Seems like if one could single-step an RTL machine, not optimize, and have comments in the RTL linking back to the source line by line, one could have something like emacs+gcc 'playing' the code and showing changes to variables as they happen. Of course, emulating some of the sketchy stuff one does in systems coding, like hand-building a thread and feeding it to a scheduler, could get interesting. But normal, user-mode C could probably be emulated.
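You can at least look at the RTL gcc already generates: -fdump-rtl-expand writes each function's RTL (Lisp-ish syntax, with insns carrying source file and line references in recent gcc versions) to a dump file, which would be the obvious starting point for that kind of single-stepping toy. A throwaway example (file and function names are arbitrary):

    /* rtl_demo.c -- compile with:
     *     gcc -O0 -c -fdump-rtl-expand rtl_demo.c
     * which leaves a rtl_demo.c.*r.expand dump containing the RTL for
     * this function, before the optimization passes have touched it. */
    int add_one(int x)
    {
        return x + 1;
    }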
>> I can't say I ever really saw the point, but I was almost never in userspace anyway.
> :craylol: Chads live in Ring 0. :chuckmoore2:
;) The real thrill was microcode: everything was mission-critical, and if something went wrong, the machine either froze or started producing random values. No debugging other than reading the source. Definitely a young man's sport. One really had to be able to single-step the (parallel) hardware in one's head. And modern architectures with branch prediction and such... Maybe they have emulators LOL.
I'll look at the technical stuff when I feel smarter, meaning after coffee.
-
@dielan @LeroyBrown8 @Moon @jeffcliff
Over on Twitter, no one seems to know what the Fedi is; they think Mastodon is a site for Leftists, like 'Tribal'.
If Musk was serious about free speech, he'd make Twitter a fedi instance.