[0] https://www.bartlettpublishing.com/site/books/learn-to-progr...
[1] https://download-mirror.savannah.gnu.org/releases/pgubook/Pr...
This sounds like another one of those common "learn Asm by acting like a compiler" articles, which IMHO completely misses one of the best reasons to learn Asm: you can beat the compiler on size (relatively easy), speed (often harder), or both, precisely by not acting like one. I suspect the author, like so many others, also learned from only reading (some) compiler output. The complete lack of any use of static initialised data is shocking.
mov rdi, rdi
lea rsi, [rsp]
Please don't do this. Even a compiler can do better at -O0.

Stripped and OMAGIC (--omagic linker flag; from the man page: "Set the text and data sections to be readable and writable. Also, do not page-align the data segment"): 1776 bytes (1 KiB)
Besides being a very notable date (was that deliberate?), 1776 is closer to 2k than 1k. I suspect if you wrote it in C with inline Asm for the syscalls, it wouldn't be much bigger (and may even be a little smaller.)
If you want to see what Asm can really do, the sub-1k categories in the demoscene are well worth looking at.
These are pretty non-specific, but here are areas I already know about, for others who may have the same question as me:
1. Compiler development
2. Security research (malware analysis/reverse engineering) - although not much, if any, writing of assembly, mostly reading
3. Kernel development - again, mostly reading assembly rather than writing it. The bulk of the code is written in C (or, as a very recent development, Rust)
4. Driver development - mostly C but some devices can involve assembly
Why is Linux singled out there? No OS can use rcx for that, since the syscall instruction itself overwrites rcx with the return address.
The app is an X11 client and will run under an OS, meaning you’ll learn to make system calls and other library calls to get things on the screen. Very educational, and not scary-deep.
This is not correct. Only Linux has a stable kernel-userspace interface which allows you to depend on these numbers. On pretty much every other operating system, you are required to go through the system libraries they provide. A program with these numbers hardcoded into it will break when the OS developers change the syscalls.
I wrote an article about this with more details and lots of citations:
Should probably be something like "Writing a Linux X11 application in assembly".
Is that true? I remember ~20 years ago I was looking at the i386 syscall ABIs (since amd64 wasn't big then), and there, Linux syscalls passed arguments by register and FreeBSD passed them on the stack. Maybe for amd64, FreeBSD switched to pass by register on Intel, but I wouldn't assume a syscall ABI is such a quick and simple substitution.
* I would like to understand the assembly used for exception handling. Does anybody know how exceptions work at an assembly level? (I am interested in algebraic effects)
* Need to create a closure in assembly.
* I have some coroutine-executing assembly, ported to GNU assembler syntax from a blog post whose website is now down.
Steve Yegge worked there and tells an interesting story. 15 million lines of hand-written x86 assembly!
http://steve-yegge.blogspot.com/2008/05/dynamic-languages-st...
"OK: I went to the University of Washington and [then] I got hired by this company called Geoworks, doing assembly-language programming, and I did it for five years. To us, the Geoworkers, we wrote a whole operating system, the libraries, drivers, apps, you know: a desktop operating system in assembly. 8086 assembly! It wasn't even good assembly! We had four registers! [Plus the] si [register] if you counted, you know, if you counted 386, right? It was horrible.
"I mean, actually we kind of liked it. It was Object-Oriented Assembly. It's amazing what you can talk yourself into liking, which is the real irony of all this. And to us, C++ was the ultimate in Roman decadence. I mean, it was equivalent to going and vomiting so you could eat more. They had IF! We had jump CX zero! Right? They had "Objects". Well we did too, but I mean they had syntax for it, right? I mean it was all just such weeniness. And we knew that we could outperform any compiler out there because at the time, we could!
"So what happened? Well, they went bankrupt. Why? Now I'm probably disagreeing – I know for a fact that I'm disagreeing with every Geoworker out there. I'm the only one that holds this belief. But it's because we wrote fifteen million lines of 8086 assembly language. We had really good tools, world class tools: trust me, you need 'em. But at some point, man...
"The problem is, picture an ant walking across your garage floor, trying to make a straight line of it. It ain't gonna make a straight line. And you know this because you have perspective. You can see the ant walking around, going hee hee hee, look at him locally optimize for that rock, and now he's going off this way, right?
"This is what we were, when we were writing this giant assembly-language system. Because what happened was, Microsoft eventually released a platform for mobile devices that was much faster than ours. OK? And I started going in with my debugger, going, what? What is up with this? This rendering is just really slow, it's like sluggish, you know. And I went in and found out that some title bar was getting rendered 140 times every time you refreshed the screen. It wasn't just the title bar. Everything was getting called multiple times.
"Because we couldn't see how the system worked anymore!"
...I have to say, the "140 redraws by accident" part sounds like an ordinary day in web UI development using 2023 frameworks. The problem of not seeing the entire picture of what's going on isn't limited to assembly programmers. You can start from the opposite end of the abstraction spectrum and end up with the same issues.
Both were for 32 bit assembly, not 64 bit, IIRC.
Paul Carter was a professor or lecturer at a US college.
I think his book was available online.
This takes me back about 30 years as a youngster discovering the magic ASM incantation to efficiently draw to the screen in DOS mode 0x13.
Or even less recently...whoever wrote the first Rust, Zig, or insert <new compiled language> here?
Because don't you ultimately have to know how to make your own syntax translate into efficient assembly code?
Or is there some way these days for programming language designers/creators to avoid it entirely?
...
cmp BYTE [rsp], 1
jnz die
How can I diagnose the issue? The article didn't dive into the matter of reading error codes.
> just 600 lines of code
It wasn't until I created a simple 8086 emulator, one that takes the raw machine-code bytes, translates them not only into assembly instructions but actually emulates what those instructions do, that I finally felt like I REALLY knew assembly.
My suggestion to others who want to learn assembly is to skip the assembly books. Using whatever language you want, first write a translator from machine code into assembly instructions, and then build an emulator. You only need to implement a small subset of the instructions; check out godbolt and translate some simple programs to see which instructions you need.
Other than that, all you really need is the 8086 manual; it has all the information. I also found this site useful when implementing the flags: https://yassinebridi.github.io/asm-docs/8086_instruction_set.... This takes less time than finishing a book and you learn a LOT more.
The goal is not to program in assembly at all, but to truly understand the cost of everything and what you can expect from your hardware.