Re-inventing the wheel with programming binary from scratch

Page 2 of 3 [ 41 posts ]

eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

04 Apr 2016, 10:11 pm

On the old Data General Supernova computer I used to have to enter a bootstrap program a word at a time by flipping switches to set bits. That wasn't programming.

Sure, you can write programs directly in a binary/octal/hexadecimal representation -- but that's called machine language, not binary.



The person who developed the boot loader may have written it in machine language or he may have written it in assembly language and then converted that to machine language. I suspect the latter.

If someone had told him he was writing in binary, he would have thought that they were idiots.

If I were interviewing someone for a job and they said that they could write code in binary, I might ask them what they meant, or I might just regard them as clueless. In either case, they would not get the job. If they told me that they wrote code in machine language for anything but a few lines, I wouldn't believe them and would not hire them. The correct term is either assembly language or assembler language.

When I had to toggle the bootstrap on the Supernova, I was not programming, at least not in a computer professional's use of the term. Once I had toggled in the bootstrap and started it running, the machine would read my program from the paper tape (created from a deck of cards by an assembler on a Nova) and execute it.

And what it shows is how to produce the machine language from the assembly language. That can be useful because the assembler's listing will usually show the generated machine instructions in either octal or hexadecimal, depending on the machine. I've never seen one list the machine language in binary.
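For instance, a MACRO-11-style listing might look something like this (the addresses and octal words here are worked out by hand for illustration, not taken from a real build):

Code:
000100  010304          MOV  R3,R4      ; register to register
000102  011314          MOV  (R3),(R4)  ; word at address in R3 to address in R4
000104  012705 000012   MOV  #12,R5     ; immediate: the operand word follows

The middle column is the machine language; the assembler did all the bookkeeping.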



Last edited by eric76 on 04 Apr 2016, 10:27 pm, edited 2 times in total.

Edenthiel
Veteran

Joined: 12 Sep 2014
Age: 56
Gender: Female
Posts: 2,820
Location: S.F Bay Area

04 Apr 2016, 10:18 pm

Hi Erik - I'm assuming that thanks to Cloudflare all that came through was the quoted part, not your typing (Cloudflare eats comments).


_________________
“For small creatures such as we the vastness is bearable only through love.”
―Carl Sagan


eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

04 Apr 2016, 10:26 pm

Edenthiel wrote:
Hi Erik - I'm assuming that thanks to Cloudflare all that came through was the quoted part, not your typing (Cloudflare eats comments).


Yeah. I had to go back and enter my comments. When it refused to take them again, I removed the quotes and it seems to have gone through okay.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

04 Apr 2016, 10:41 pm

Even if you were truly gifted and could write machine code directly, there are at least two good reasons to use assembly language instead.

First, to make it readable and enable others to modify the code.

Second, if you had to remove or add an instruction, you would have to go through the code and recalculate every offset, branch, and jump by hand.
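A quick sketch of the problem on the PDP-11 (octal; the addresses and offsets are made up for illustration):

Code:
000100  005700   TST  R0     ; test R0
000102  001402   BEQ  DONE   ; 001400 + offset 002: branch two words forward
000104  005200   INC  R0
000106  000240   NOP
000110           DONE:

Insert one more one-word instruction before DONE and the branch word at 000102 has to be recomputed by hand from 001402 to 001403 -- and likewise for every other branch or address that crosses the insertion point.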

That said, I did on one occasion directly modify an executable that I didn't have the source code for. A company where I once worked had a fully licensed copy of some medical billing software on a PDP-11, but the company that owned the code did not trust its customers at all and would only authorize the software for three months at a time. So every three months, they'd have to dial into the machine and patch it. That got old pretty quick, so I dug through the executable, figured out the wacky date format they used to disguise the expiration date and where it was stored, and manually changed that date to a couple of decades in the future. After that, it never quit working and never needed to be reauthorized. In this case, however, there was no need to recalculate any offsets or locations in the code.



Edenthiel
Veteran

Joined: 12 Sep 2014
Age: 56
Gender: Female
Posts: 2,820
Location: S.F Bay Area

04 Apr 2016, 11:42 pm

Oh, I agree completely that in a practical sense there is no reason to actually mess with individual bits, and generally speaking, the higher the language, the more you can get done in the same amount of time and effort. Although sometimes a balance needs to be struck too, as with RTOSes or other cases where the extra layers are too inefficient.

But for someone just starting out (or someone who is interested in processor design), the state of individual bits can be enlightening, no?

Nice PDP-11 story! I recently visited the Computer History Museum again w/my child and saw their PDP-1, along with just about every other milestone computer up to the IBM PC era, including a full IBM 1401 room. A wee bit before my time; the closest I've come to working with the PDP series was the Alpha Micro, which apparently 'borrowed' much of its OS code from DEC's PDP-11.


_________________
“For small creatures such as we the vastness is bearable only through love.”
―Carl Sagan


Chichikov
Veteran

Joined: 27 Mar 2016
Age: 50
Gender: Male
Posts: 1,151
Location: UK

05 Apr 2016, 5:22 am

EphraimB wrote:
I want to learn binary from scratch (example: learn how to program drivers, display pixels on the screen, and make an operating system in binary from scratch). I want to be able to program at the lowest level possible (which everybody did in the 1970s). I don’t like relying on other people’s code at all. I feel like re-inventing the wheel will open up new possibilities for me in technology. I want to actually change the world. I don’t want to be known as a simple “typical programmer” just like all other programmers. I want to be famous for making real new evolutions in technology.

Is there any place I can learn this?


Don't have much to add beyond what's already been posted. If you already have a PC, then maybe the first thing I'd do is learn assembler on the PC. Yeah, it's going to be more complicated than it is for the 8-bits etc., but at least the tools will be modern and the reference materials will be easier to find. If you really want to get low-level, something like a Pi is probably going to be your best bet, primarily for the reasons above. You could also try to get an old 8-bit machine like an Amstrad CPC, but my concern there would be getting the right tools and reference manuals and finding people who know those old systems. If you can't get a physical machine, you could try an emulator.

But yeah, that would be my path... I'd give it a go on the PC to see if you can even get your head around the concepts, even if it is like running before you can walk, and once you've got the basics you can look to non-PC things to really get into it. Or if you're looking to write things like drivers, stick with the PC and just see how it goes.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

05 Apr 2016, 7:36 pm

Edenthiel wrote:
But for someone just starting out (or someone who is interested in processor design), the state of individual bits can be enlightening, no?


Learning the assembly language for one or more processors is quite enlightening. Being able to convert an assembly language statement to machine language just goes along with learning assembly language. Programming in assembly language is useful, but programming in machine language is very rarely needed. If you need to do that, you would probably be better off if you just used a disassembler (I used to have disassemblers I wrote for the PDP-11 and the IBM 360/370) to convert the machine language back to assembly language.

Remember that there is a one to one correspondence between assembly language and machine language (not counting macros). There would be no advantage whatsoever from trying to learn to use machine language directly instead of assembly language. If anything, it would seriously hamper the learning curve. About all it would likely do is discourage the user.
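A small made-up sample of that mapping (PDP-11 octal words, worked out by hand), which is exactly what a disassembler relies on:

Code:
010304  <->  MOV R3,R4
005200  <->  INC R0
000240  <->  NOP

Each statement here assembles to exactly one machine word, and each word disassembles back to the same statement.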

But assembly language is very interesting. And if you are good at details, it is a joy to learn.

As far as speed goes, it can be very difficult to write assembly language code by hand that is more efficient than what today's optimizing compilers produce.

That said, have you ever heard of Steve Gibson? He writes (or at least wrote) programs for Microsoft Windows that use assembly language with extremely good results. See https://www.grc.com/smgassembly.htm.



Edenthiel
Veteran

Joined: 12 Sep 2014
Age: 56
Gender: Female
Posts: 2,820
Location: S.F Bay Area

07 Apr 2016, 1:01 pm

eric76 wrote:
Edenthiel wrote:
But for someone just starting out (or someone who is interested in processor design), the state of individual bits can be enlightening, no?


Learning the assembly language for one or more processors is quite enlightening. Being able to convert an assembly language statement to machine language just goes along with learning assembly language. Programming in assembly language is useful, but programming in machine language is very rarely needed. If you need to do that, you would probably be better off if you just used a disassembler (I used to have disassemblers I wrote for the PDP-11 and the IBM 360/370) to convert the machine language back to assembly language.

Remember that there is a one to one correspondence between assembly language and machine language (not counting macros). There would be no advantage whatsoever from trying to learn to use machine language directly instead of assembly language. If anything, it would seriously hamper the learning curve. About all it would likely do is discourage the user.

But assembly language is very interesting. And if you are good at details, it is a joy to learn.

As far as speed goes, it can be very difficult to write assembly language code by hand that is more efficient than what today's optimizing compilers produce.

That said, have you ever heard of Steve Gibson? He writes (or at least wrote) programs for Microsoft Windows that use assembly language with extremely good results. See https://www.grc.com/smgassembly.htm.


All quite true. I was recommending going even lower just as a baseline lesson for someone who seemed to want to know that part. As far as Steve Gibson goes... well, he's a bit of a controversial figure among security geeks.


_________________
“For small creatures such as we the vastness is bearable only through love.”
―Carl Sagan


eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:42 pm

Cloudflare really sucks. I'm not even quoting anything and it's giving me hell about trying to post this. Why does anyone in their right mind use that stinking service?



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:44 pm

Will try one paragraph at a time.

There is no going lower except to study the CPU itself. One could use a logic analyzer on a CPU to examine the various signals.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:44 pm

Also, one could design a chip to perform certain calculations in hardware. For example, one could build a chip that does encryption and decryption directly instead of running a program. If someone is interested in that, then they need to get into electrical engineering.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:44 pm

But as far as programming a general-purpose computer goes, the lowest you are going to be able to go is machine language and assembly language. Remember that an assembly language is essentially a representation of the machine language in a form that one can read.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:45 pm

For example, on the PDP-11, the MOV instruction will move a word from one place to another. To move the contents of register R3 to register R4, one could write "MOV R3,R4". Or, if R3 contains the address of the word you want to move from and R4 contains the address you want to move it to, you could write "MOV (R3),(R4)".



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:46 pm

Cloudflare won't let me post this paragraph. Paragraph deleted. Use your imagination as to what it might have said. Note that it had something to do with the previous paragraph.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:46 pm

By the way, on the PDP-11, this is all in octal, not hexadecimal.
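For what it's worth, here is how those two MOV examples encode, using the standard double-operand format (worked out by hand, so check it against a real assembler):

Code:
MOV R3,R4      ->  010304   ; 01 = MOV, 03 = mode 0/reg 3, 04 = mode 0/reg 4
MOV (R3),(R4)  ->  011314   ; 13 = mode 1 (deferred)/reg 3, 14 = mode 1/reg 4

The mode and register fields are three bits each, so the octal digits line up with the instruction fields. That is why octal, not hexadecimal, was the natural radix on that machine.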



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,660
Location: In the heart of the dust bowl

07 Apr 2016, 6:47 pm

So what advantage would there be to using a binary editor to insert the machine language directly? None at all. It would be a huge pain in the neck.