Re-inventing the wheel with programming binary from scratch

Page 3 of 3 [ 41 posts ]

eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:46 pm

By the way, on the PDP-11, this is all in octal, not hexadecimal.
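That choice fits the instruction format: PDP-11 instruction fields fall on 3-bit boundaries, so octal digits line up with the fields while hex digits straddle them. A quick sketch (the encoding shown is the real PDP-11 MOV format; the Python is just for display):

```python
# A PDP-11 word reads cleanly in octal because instruction fields
# fall on 3-bit boundaries.  MOV src,dst is opcode 01 (octal),
# then a 6-bit source field and a 6-bit destination field.
word = 0o010203          # MOV R2,R3 -- octal digits are the fields: 01 / 02 / 03
print(oct(word))         # 0o10203
print(hex(word))         # 0x1083 -- the field boundaries vanish in hex
```

In hex the same word is 0x1083, where no digit corresponds to any one field.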



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:47 pm

So what advantage would there be to using a binary editor to insert the machine language directly? None at all. It would be a huge pain in the neck.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:47 pm

Furthermore, if you want to go back to it years later, trying to read it from the machine language would not be worth the trouble. I can still read PDP-11 and VAX-11 assembly language code I wrote many years ago, but I couldn't begin to read the machine language without an extended refresher.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:47 pm

There is no going lower except to study the CPU itself. One could use a logic analyzer on a CPU to examine the various signals.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:48 pm

Also, one could design a chip to perform certain calculations in hardware. For example, one could build a chip to do encryption and decryption on a chip instead of from a program. If someone is interested in that, then they need to get into electrical engineering.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:48 pm

But as far as programming a general purpose computer, the lowest you are going to be able to do is machine language and assembly language. Remember that an assembly language is essentially a representation of the machine language in a form that one can read.
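That one-to-one relationship can be sketched with a toy assembler. The mnemonics and opcodes below are invented for illustration, not a real instruction set; the point is only that assembling is a direct mnemonic-to-number translation:

```python
# Toy assembler: each mnemonic maps 1:1 to a numeric opcode.
# The ISA here is invented for illustration, not a real machine.
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "HALT": 0xF}

def assemble(source: str) -> list:
    """Turn lines like 'ADD 5' into 16-bit words: opcode in the
    high nibble, operand in the low 12 bits."""
    words = []
    for line in source.strip().splitlines():
        parts = line.split()
        op = OPCODES[parts[0]]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((op << 12) | (operand & 0xFFF))
    return words

program = """\
LOAD 10
ADD 5
STORE 11
HALT"""
print([hex(w) for w in assemble(program)])  # ['0x100a', '0x2005', '0x300b', '0xf000']
```

Disassembly is the same table read in reverse, which is why assembly listings and machine words carry the same information.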



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:48 pm

And if you wanted to modify the code to make it work on another processor, you could conceivably do that with assembly language, but it might not be very easy. For example, Intel processors don't have the same idea of general purpose registers. Doing that with machine language would be really tough.
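A simplified sketch of that register-model difference (assembler syntax abbreviated, instructions chosen for illustration):

```asm
; PDP-11: R0-R5 are interchangeable general-purpose registers
ADD  R1, R2        ; R2 = R2 + R1, any register pair works the same way

; 8086: some operations bind to specific registers
MOV  AX, BX
MUL  CX            ; DX:AX = AX * CX -- MUL always uses AX implicitly
```

Porting at the assembly level means re-deciding which values live in which registers; at the machine-language level you'd be re-deriving every encoding by hand as well.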



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:49 pm

By the way, regarding Steve Gibson, I don't agree with some of his ideas about security at all. The notion of trying to close your ports so that attackers can't find them is just plain stupid. Forget all his nonsense about stealth and you'll be much smarter.

On the other hand, it won't hurt to pay attention to his ideas about assembly language.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 6:50 pm

Cloudflare sucks. I see no reason to put up with that s**t.

I'm outta here. Will check again after a while to see if it is still in use. If it is, then it will be time to say goodbye, possibly forever.



eric76
Veteran

Joined: 31 Aug 2012
Gender: Male
Posts: 10,721
Location: In the heart of the dust bowl

07 Apr 2016, 9:57 pm

Back for a moment with a bit more to add.

On the VAX computers, there was a level between the CPU and the assembly/machine language called microcode. You used microcode to define the assembly/machine language.

You couldn't write programs with the microcode, but you could conceivably create a single assembly/machine language instruction that did a lot of processing.

For example, with microcode you could create an assembly language instruction that would compute a Cyclic Redundancy Check on a block of data. I don't remember clearly, but I think that DEC may have had such an instruction on the VAX computers.
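In software, that work is an explicit bit-by-bit loop; a microcoded instruction collapses the whole loop into one opcode. A sketch of the kind of computation involved (CRC-16/CCITT parameters chosen for illustration, not necessarily what DEC implemented):

```python
def crc16(data: bytes, poly=0x1021, crc=0xFFFF) -> int:
    """Bitwise CRC-16/CCITT-FALSE: the per-byte, per-bit loop that a
    microcoded CRC instruction would perform in hardware."""
    for byte in data:
        crc ^= byte << 8              # fold the next byte into the register
        for _ in range(8):            # shift out one bit at a time
            crc = ((crc << 1) ^ poly) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF             # keep it a 16-bit register
    return crc

print(hex(crc16(b"123456789")))  # 0x29b1, the published check value for these parameters
```

Either way the result is the same; the difference is how many instruction fetches it costs.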



Edenthiel
Veteran

Joined: 12 Sep 2014
Age: 51
Gender: Female
Posts: 2,819
Location: S.F Bay Area

08 Apr 2016, 1:31 am

eric76 wrote:
Back for a moment with a bit more to add.

On the VAX computers, there was a level between the CPU and the assembly/machine language called microcode. You used microcode to define the assembly/machine language.

You couldn't write programs with the microcode, but you could conceivably create a single assembly/machine language instruction that did a lot of processing.

For example, with microcode you could create an assembly language instruction that would compute a Cyclic Redundancy Check on a block of data. I don't remember clearly, but I think that DEC may have had such an instruction on the VAX computers.


Microcode is still very much around: it's the "CPU firmware" of x86 and x64 Intel and AMD processors. Complex functions and bug fixes, mostly. I'd also like to say that Arduinos use microcode, but I'm not sure people aren't just making a portmanteau of microCPU and programming code. I'll have to dig deeper... it's certainly not the high level stuff most are referring to when they use the term with Arduinos. PIC processors, yes.


_________________
“For small creatures such as we the vastness is bearable only through love.”
―Carl Sagan