Page 2 of 3 [ 39 posts ]

MaxE
Veteran

Joined: 2 Sep 2013
Gender: Male
Posts: 5,264
Location: Mid-Atlantic US

10 Nov 2022, 7:25 am

klanka wrote:
interesting read
"BTW C++ was by far the most challenging language I've ever programmed in."
because of no garbage collection ?

Memory management was a big issue, but so were other things like constness and how templates work, along with the fact that each OS had its own "flavor" of the language. C++ had very much the feeling of a language invented in Europe. To explain: it's my impression that programmers in Europe spend far more time at university than programmers elsewhere, and tend to see everything as an academic exercise, which can get in the way of achieving a practical result. No disrespect meant for Linus Torvalds, though. His academic background may have been a plus in developing Linux, because an OS built from the ground up needs mathematical rigor.


_________________
My WP story


Fenn
Veteran

Joined: 1 Sep 2014
Gender: Male
Posts: 2,456
Location: Pennsylvania

10 Nov 2022, 10:43 am

C++ was an attempt to extend C with OO to make it like Smalltalk.
C was originally created to write portable code for operating systems. So C++ combines the buffer overruns and wild pointers of C with the complexity of Smalltalk, which was supposed to be "small and expressive". The fellow who came up with Smalltalk was a polymath with a degree in biology who was working at Xerox PARC. He wanted each "object" to be like a mini computer on a network with other computers, or like cells, or "monads", which were a bit like Plato's Forms. The whole C++ template thing has to do with metaprogramming and type safety. It is kind of the polar opposite of Lisp. In Lisp everything is either an Atom or a List, including the code. All the rest is Structure and Functions/Lambdas. My head loves Lisp and not so much OO languages.
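The "code is data" idea can be sketched even outside Lisp. Below is a toy illustration in Python (not Lisp), where an s-expression like (+ 1 (* 2 3)) is modeled as nested lists and walked by a tiny evaluator; the function and table names are mine, purely for illustration:

```python
# Toy illustration of the Lisp idea that code is just atoms and lists.
# The "program" (+ 1 (* 2 3)) becomes the nested list ["+", 1, ["*", 2, 3]],
# so the same data structure holds both data and code.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Atoms evaluate to themselves; a list is (operator, operand, operand)."""
    if not isinstance(expr, list):          # an Atom
        return expr
    op, *args = expr                        # a List: head names the operator
    return OPS[op](*(evaluate(a) for a in args))

print(evaluate(["+", 1, ["*", 2, 3]]))     # prints 7
```

Because the program is an ordinary list, it can be built, inspected, and rewritten by other code before being evaluated, which is the heart of Lisp-style metaprogramming.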


_________________
ADHD-I(diagnosed) ASD-HF(diagnosed)
RDOS scores - Aspie score 131/200 - neurotypical score 69/200 - very likely Aspie


techstepgenr8tion
Veteran

Joined: 6 Feb 2005
Age: 44
Gender: Male
Posts: 24,182
Location: 28th Path of Tzaddi

10 Nov 2022, 1:20 pm

Fenn wrote:
C++ was an attempt to extend C with OO to make it like Smalltalk.
C was originally created to write portable code for operating systems. So C++ combines the buffer overruns and wild pointers of C with the complexity of Smalltalk, which was supposed to be "small and expressive". The fellow who came up with Smalltalk was a polymath with a degree in biology who was working at Xerox PARC. He wanted each "object" to be like a mini computer on a network with other computers, or like cells, or "monads", which were a bit like Plato's Forms. The whole C++ template thing has to do with metaprogramming and type safety. It is kind of the polar opposite of Lisp. In Lisp everything is either an Atom or a List, including the code. All the rest is Structure and Functions/Lambdas. My head loves Lisp and not so much OO languages.

Out of curiosity, with the extent of your studies, what sorts of languages do you think will be the future of programming? Also, what's your level of concern as to whether AI will be writing most of the code by the early 2030s? On the latter point, thinking of my own clients, I can't imagine these people adequately giving even an AI high-level instructions and getting what they want. For them, their company's business processes are a subconscious thing they don't really know off the top of their head, and I'd have to imagine they'd spend most of their time yelling at consultants if they aren't yelling at the AI directly.


_________________
“Love takes off the masks that we fear we cannot live without and know we cannot live within. I use the word "love" here not merely in the personal sense but as a state of being, or a state of grace - not in the infantile American sense of being made happy but in the tough and universal sense of quest and daring and growth.” - James Baldwin


Radish
Veteran

Joined: 10 May 2022
Age: 64
Gender: Male
Posts: 13,233
Location: UK

10 Nov 2022, 1:26 pm

techstepgenr8tion wrote:
I can't imagine these people adequately giving even an AI high-level instructions and getting what they want.


Indeed. I always found the trickiest part of software development to be the analysis phase. Often the managers and directors I dealt with, who wanted to specify the software, weren't aware of exactly what happened in their business at the sharp end. It was only by having long conversations with the future end users that all the complexities of their business and business processes came to light: so many exceptions to the norm that had to be coded for, or you ended up with lots of ongoing scope creep, or additions and changes after software delivery.

I do think AI will get there eventually, but it may be decades before it can produce functioning code that does what the end users and the business want and need.


_________________
This space intentionally left blank.


Fenn
Veteran

Joined: 1 Sep 2014
Gender: Male
Posts: 2,456
Location: Pennsylvania

10 Nov 2022, 1:37 pm

Some discussion of encoding puzzles and solutions here:

Computers, Math, Science, and Technology - Sudoku And Related Puzzles


_________________
ADHD-I(diagnosed) ASD-HF(diagnosed)
RDOS scores - Aspie score 131/200 - neurotypical score 69/200 - very likely Aspie


Fenn
Veteran

Joined: 1 Sep 2014
Gender: Male
Posts: 2,456
Location: Pennsylvania

10 Nov 2022, 2:10 pm

techstepgenr8tion wrote:
Out of curiosity, with the extent of your studies, what sorts of languages do you think will be the future of programming? Also, what's your level of concern as to whether AI will be writing most of the code by the early 2030s? On the latter point, thinking of my own clients, I can't imagine these people adequately giving even an AI high-level instructions and getting what they want. For them, their company's business processes are a subconscious thing they don't really know off the top of their head, and I'd have to imagine they'd spend most of their time yelling at consultants if they aren't yelling at the AI directly.


Two things might significantly affect programming: one is Siri (and Siri work-alikes) and the other is something like MIT Scratch (and similar graphical languages).

Python is also very popular for AI right now. The main driver seems to be that it is easy to learn, so non-programmers such as mathematicians, physicists, and biochemists tend to pick it up fast.

Scratch can be used to code graphically and interactively, and can also be converted to something like Java or C++ automatically, so kids can learn it and continue to ramp up from there. If my 12-year-old son is anything to judge by, lowering the barrier to entry is important, and "getting results fast" is important too. So is being graphical, and Scratch is more graphical than most IDEs. Scratch objects can be created on screen but are saved as XML, and the XML can then be post-processed. Berkeley Snap! is also interesting. To get mind-share a language has to be powerful enough to be used in "real projects" but also attractive to new young programmers. Scratch is based on Squeak, which is based on Smalltalk-80. Older versions used Java and Flash, but that is gone now.
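As a rough sketch of that "save as XML, then post-process" workflow: the element and attribute names below are hypothetical, not a real Scratch schema (recent Scratch versions actually save projects as JSON), so treat it as the general pattern rather than a working converter:

```python
# Sketch: post-processing a graphical language's XML project export.
# The <project>/<sprite> structure is invented for illustration only.
import xml.etree.ElementTree as ET

project_xml = """
<project name="maze-game">
  <sprite name="cat"><script blocks="12"/></sprite>
  <sprite name="ghost"><script blocks="7"/></sprite>
</project>
"""

root = ET.fromstring(project_xml)
for sprite in root.findall("sprite"):
    blocks = sprite.find("script").get("blocks")
    print(f"{sprite.get('name')}: {blocks} blocks")
```

A converter to Java or C++ would walk the same tree and emit source text instead of printing a summary.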

Siri, voice recognition, and natural language processing also allow non-programmers to interact with computers. It is part of the trend of letting non-programmers do things that used to require a programmer. Replacing programmers entirely has long been a goal; it was this idea that made COBOL so verbose. More and more, I am seeing jobs and tasks that used to require a programmer get replaced with point-and-click interfaces or some kind of configuration utility. Combine that with smarter and smarter voice recognition and natural language processing, and the two ideas will eventually converge: kind of like a self-driving car for code. I don't think this will happen in the next 10 years, but I expect to see it as a trend that has already started and will continue. The future of programming languages is no programming language. One thing to note: some problems have an essential level of complexity, so you just end up pushing the complexity from one place to another, never eliminating it.

Another language I like is golang. Unlike Scratch, Java, and Python, it compiles down to machine code, so it can be self-hosting. One of the things that makes Python great is the ability to use C and C++ to optimize or extend it: if it is not fast enough, or not able to interface with hardware or OS libraries, you can drop down to C. It is also a curse, since C has its own troubles, and ABI and compiler differences are always a pain. Go is both high-level and low-level.
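One concrete form of "dropping down to C" from Python is the standard-library ctypes module, which loads a shared C library at runtime. A minimal sketch, assuming a Unix-like system where the C math library can be located:

```python
# Sketch: calling a C library function from Python via ctypes.
# Assumes a Unix-like system where the C math library (libm) is findable.
import ctypes
import ctypes.util

libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.argtypes = [ctypes.c_double]   # declare the C signature
libm.sqrt.restype = ctypes.c_double      # so values convert correctly

print(libm.sqrt(2.0))                    # ~1.4142135623730951
```

For serious speed-ups people usually reach for C extension modules, Cython, or NumPy, but ctypes shows the C boundary with the least ceremony.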

If you are interested in embedded systems (or robotics), C/C++ is still a good way to go.

No-one can really tell the future. AI has had more than one "AI winter" already; there may be more ahead.


_________________
ADHD-I(diagnosed) ASD-HF(diagnosed)
RDOS scores - Aspie score 131/200 - neurotypical score 69/200 - very likely Aspie


Fenn
Veteran

Joined: 1 Sep 2014
Gender: Male
Posts: 2,456
Location: Pennsylvania

10 Nov 2022, 2:31 pm

links on the history of programming languages and Alan Kay

https://en.wikipedia.org/wiki/Alan_Kay
https://wiki.c2.com/?AlanKay

https://wiki.c2.com/?SmalltalkLanguage
https://wiki.c2.com/?CeeLanguage
https://wiki.c2.com/?CeePlusPlus
https://en.wikipedia.org/wiki/WikiWikiWeb
https://en.wikipedia.org/w/index.php?ti ... edirect=no

https://www.danieltorresblog.com/blog/t ... -beginning

https://en.wikipedia.org/wiki/AI_winter

https://en.wikipedia.org/wiki/Lisp_(pro ... g_language)


(Please note - I am one of those people who feels he knows nothing about a subject until he knows everything about the subject. I have gone on deep dives into the history of programming. Surely this is not for everyone. )


_________________
ADHD-I(diagnosed) ASD-HF(diagnosed)
RDOS scores - Aspie score 131/200 - neurotypical score 69/200 - very likely Aspie


Last edited by Fenn on 10 Nov 2022, 3:52 pm, edited 1 time in total.

techstepgenr8tion
Veteran

Joined: 6 Feb 2005
Age: 44
Gender: Male
Posts: 24,182
Location: 28th Path of Tzaddi

10 Nov 2022, 3:30 pm

Fenn wrote:
(Please note - I am one of those people who feels he knows nothing about a subject until he knows everything about the subject. I have gone on deep dives into the history of programming. Surely this is not for everyone.)

I generally acknowledge that I don't have the cognitive bandwidth to know everything and hold it in memory, so I try instead to keep the best indexes / tables of contents on as many topics as I can, and to hone my skill at spotting the experts in any area.

One of the things I've found interesting in the past few years is the concept of intellectual dark matter, i.e. knowledge of how things work that's out there but often siloed within a given generation and sub-discipline. When we move beyond that point in terms of innovation, or when that generation retires, we forget earlier languages, structures, etc., and then run into problems, or rediscover obstacles or solutions that we didn't realize we already knew about. Both Samo Burja and Jonathan Blow have talked about different aspects of this. Blow addressed it specifically in the programming context in one of his lectures, where he brought up failed chip designs at Intel: the new engineers had the theory but not the experimental data that the previous engineers had used to build the chip the way they had. He also raised concerns about the deeper layers of the internet and the software used around the world, and whether enough people actually know the logic and languages at the lower levels. I remember a trip back in 2019 where I met a Russian guy who told me that more COBOL programmers were still needed to maintain old code for a lot of institutions, and that it was actually a hot field.

Curious whether you think that's going to be a big problem, or whether there are enough intrepid people out there already aware of it and willing to learn, or willing to pay for that knowledge?


_________________
“Love takes off the masks that we fear we cannot live without and know we cannot live within. I use the word "love" here not merely in the personal sense but as a state of being, or a state of grace - not in the infantile American sense of being made happy but in the tough and universal sense of quest and daring and growth.” - James Baldwin


Fenn
Veteran

Joined: 1 Sep 2014
Gender: Male
Posts: 2,456
Location: Pennsylvania

10 Nov 2022, 4:33 pm

I was coding around the time of Y2K, and there were a lot of clever solutions. One I liked was for old code where you had the executable and no source: disassemble it so you have working assembly code, then use a parser to convert the assembly line-for-line to C. You can now patch the Y2K bugs in C, and you can also migrate to a new platform or a new OS that has no Y2K bugs. Something similar could work with Java or Go as the target. The resulting code is ugly, but it can be refactored into better code over time. There are COBOL compilers (Micro Focus COBOL) that compile down to the JVM.
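For code that could be patched in place, one widely used Y2K fix was date "windowing": mapping two-digit years into a 100-year window around a pivot, instead of widening every date field. A sketch (the pivot value 70 here is just an illustrative choice):

```python
# Sketch of the classic Y2K "windowing" fix: two-digit years are mapped
# into a 100-year window around a pivot rather than widening every field.
PIVOT = 70  # illustrative: 70..99 -> 1900s, 00..69 -> 2000s

def expand_year(yy):
    """Map a two-digit year (0-99) to a four-digit year via the pivot."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(expand_year(99))  # prints 1999
print(expand_year(22))  # prints 2022
```

The catch, of course, is that windowing only postpones the problem: the code breaks again once real dates drift past the chosen window.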
The other solution I used was the "boat anchor" approach: simply develop a new solution from scratch, or by reverse-engineering the specification. A depressing experience I have had is watching new construction homes and buildings go up faster than a coding project I was on was completed. Once you are fully using the new system, the old system may be kept as a "boat anchor" (not needed, but a heavy piece of metal). Sometimes "all new code" is better than "fix the old code". It doesn't always work exactly the way the old code worked, but if it keeps the client happy, then it may not have to.


$98,539 per year on average in the USA for COBOL programmers (according to ZipRecruiter).
A LinkedIn search for "cobol in United States" yields 2,761 results (job listings).


_________________
ADHD-I(diagnosed) ASD-HF(diagnosed)
RDOS scores - Aspie score 131/200 - neurotypical score 69/200 - very likely Aspie


Carl Friedrich Gauss
Raven

Joined: 11 Aug 2022
Gender: Male
Posts: 100
Location: ...

11 Nov 2022, 1:34 pm

I took courses in C++ and object-oriented C++, then started to learn Java when I was at university. But since I gave up computer engineering after 2 years, I gave up everything about it. After university, I tried to learn Python. I can write basic code if I want, but I am not interested in coding anymore.

And as one of us mentioned before, AI will end these things. I am working on AI, since it is almost pure mathematics. My main interest is advanced mathematics.

And as a habit I picked up at university, I have been using Linux-based operating systems for nearly 8 years. For now, I am using Ubuntu. To be honest, I can't bear Windows. As if it is designed for NTs, and Linux is for us.



Radish
Veteran

Joined: 10 May 2022
Age: 64
Gender: Male
Posts: 13,233
Location: UK

11 Nov 2022, 5:25 pm

Carl Friedrich Gauss wrote:
I can't bear Windows. As if it is designed for NTs, and Linux is for us.


Windows used to be good until Windows 8 came out, which was trash, and it's been no better since. It was at that time I switched to Linux Mint. Windows has since become an even bigger, bloated dog's dinner of an OS.


_________________
This space intentionally left blank.


techstepgenr8tion
Veteran

Joined: 6 Feb 2005
Age: 44
Gender: Male
Posts: 24,182
Location: 28th Path of Tzaddi

11 Nov 2022, 5:53 pm

Some of my least favorite things to deal with, when I'm supposed to build or transfer KPI data for people, is dashboard products like HubSpot and other things which are black-boxed (unless the company has thousands of dollars per month to pay for premium membership) so there's no hope of finding things in SQL, everything's labeled and organized in sales and middle management NT logic so finding anything you need is a PIA, the same thing can be labelled multiple ways actually, and trying to pull their exports in via another API is awful for the reasons mentioned above, ie. that you don't know which table anything is in because you don't even know the table names and what comes over the API is adult-proof. I had a project like that where I did find out that I couldn't do what the client wanted but it took beating my head against the wall for several days to prove that I wasn't just missing something (largely because of how the site and programming docs are).

Along that general theme is a raft of dumbed-down products that clients ask us to maintain which are built for non-programmers but which are, reflexively, not programmer friendly.


_________________
“Love takes off the masks that we fear we cannot live without and know we cannot live within. I use the word "love" here not merely in the personal sense but as a state of being, or a state of grace - not in the infantile American sense of being made happy but in the tough and universal sense of quest and daring and growth.” - James Baldwin


MaxE
Veteran

Joined: 2 Sep 2013
Gender: Male
Posts: 5,264
Location: Mid-Atlantic US

12 Nov 2022, 9:45 am

Radish wrote:
Carl Friedrich Gauss wrote:
I can't bear Windows. As if it is designed for NTs, and Linux is for us.


Windows used to be good until Windows 8 came out, which was trash, and it's been no better since. It was at that time I switched to Linux Mint. Windows has since become an even bigger, bloated dog's dinner of an OS.

Windows 7 was most definitely a solid OS. The versions that followed in the next couple of years didn't advance the state of the art very much, but I have been using Windows 10 more or less since it came out and have no complaints. I have used desktop Linux in the past and found it challenging to get working properly. It's the sort of thing where, if you make the effort to get it properly configured, you'll feel good about yourself and it will probably run more efficiently on whatever hardware you have, but lately I haven't had the time or energy to screw around with it. There was also a time when I would try to upgrade my hardware as much as possible. I would buy tower cases from screwdriver shops and add multimedia, extra hard drives (partly to be able to dual-boot to Linux), extra memory, and USB (when it first became available), even replacing a motherboard once. Been there, done that.

As I see it, one has 3 choices:

Mac/OSX — you get a reasonably good OS, though not as mind-blowing as its creator's constant hype would have you expect, for which a lot of commercial software is available. But you're totally locked into one supplier for everything, i.e. a monoculture, and you have to go along with whatever they decide to dish out, including the cost of upgrades, etc.
Windows (10/11) — a solid OS that, unlike OSX, will run on any hardware on which you choose to install it. It's also by far the safest "commercial" OS, ironically, because promoters of OSX incessantly accused it of being vulnerable to cyberattack literally "by design"; in fact some of that "bloat" is probably due to added layers of security. Unfortunately, because it will run on any hardware, if you bought a cheap, poorly-performing computer that ran Windows you might get a poor impression of Windows. Saying that Windows is meant for NTs is also ironic considering that Bill Gates, who is mostly credited with making it the world's most widely used OS, is popularly believed to be on the autism spectrum (although in my opinion he's far from it).
Desktop Linux — like I said, it can be great, but you need to be at least somewhat of a guru or hobbyist to succeed with it.


_________________
My WP story


Carl Friedrich Gauss
Raven
Raven

Joined: 11 Aug 2022
Gender: Male
Posts: 100
Location: ...

13 Nov 2022, 1:34 pm

MaxE wrote:
Radish wrote:
Carl Friedrich Gauss wrote:
I can't bear Windows. As if it is designed for NTs, and Linux is for us.


Windows used to be good until Windows 8 came out, which was trash, and it's been no better since. It was at that time I switched to Linux Mint. Windows has since become an even bigger, bloated dog's dinner of an OS.

Windows 7 was most definitely a solid OS. The versions that followed in the next couple of years didn't advance the state of the art very much, but I have been using Windows 10 more or less since it came out and have no complaints. I have used desktop Linux in the past and found it challenging to get working properly. It's the sort of thing where, if you make the effort to get it properly configured, you'll feel good about yourself and it will probably run more efficiently on whatever hardware you have, but lately I haven't had the time or energy to screw around with it. There was also a time when I would try to upgrade my hardware as much as possible. I would buy tower cases from screwdriver shops and add multimedia, extra hard drives (partly to be able to dual-boot to Linux), extra memory, and USB (when it first became available), even replacing a motherboard once. Been there, done that.

As I see it, one has 3 choices:

Mac/OSX — you get a reasonably good OS, though not as mind-blowing as its creator's constant hype would have you expect, for which a lot of commercial software is available. But you're totally locked into one supplier for everything, i.e. a monoculture, and you have to go along with whatever they decide to dish out, including the cost of upgrades, etc.
Windows (10/11) — a solid OS that, unlike OSX, will run on any hardware on which you choose to install it. It's also by far the safest "commercial" OS, ironically, because promoters of OSX incessantly accused it of being vulnerable to cyberattack literally "by design"; in fact some of that "bloat" is probably due to added layers of security. Unfortunately, because it will run on any hardware, if you bought a cheap, poorly-performing computer that ran Windows you might get a poor impression of Windows. Saying that Windows is meant for NTs is also ironic

This is something of a straw man, because I said "as if". To me, Linux is more sensory-friendly and simple (except for its coding/terminal part), and we have no way of knowing how Bill Gates used his Windows.

considering that Bill Gates, who is mostly credited with making it the world's most widely used OS, is popularly believed to be on the autism spectrum (although in my opinion he's far from it).

What does this mean: "he's far from it"?


Desktop Linux — like I said, it can be great, but you need to be at least somewhat of a guru or hobbyist to succeed with it.


Yes, using Linux requires knowledge at some level, and Windows does everything for you, but that can be a problem for someone who wants to know what the computer is trying to do, or who wants to be in control. But I don't want to write any more on this. I am not an advocate of any OS or company; I respect users of any OS.

I just wanted to say that I dual-boot the two OSes, since sometimes (very rarely) I need Windows. Every time I start to use it, it gives me a headache and anger.



Fenn
Veteran

Joined: 1 Sep 2014
Gender: Male
Posts: 2,456
Location: Pennsylvania

13 Nov 2022, 8:38 pm

Stupid Computer Tricks.
At one time I had Linux on my desktop at work, Linux on my desktop at home, Linux on my laptop, Linux on the machines (hundreds and thousands of them) at work, and even Linux in my kitchen (on a Gateway All-in-One) for the kids to use for homework.
I got tired of the panicked phone calls from my wife at 3 o'clock because the OS had updated automagically and now the video card didn't work anymore. I got tired of explaining that the web site the kids were assigned for homework didn't work not because Linux couldn't handle it, but because the page designer had somehow only tested it on Windows with IE.
My feeling is:
Apple has support for home users, if you pay for it. Windows is great if you are a certified Windows admin, or work for a big company that hires one (or a dozen or more).
Linux is great if you are a broke college student with more time than money, or if you are a bachelor geek who is the only user of the machine: you break it, you fix it.
If you are a father of three, with a wife who wants things to "just work", you might do what I do: go with Apple. Especially if you have more money than time.
Especially now that IE is dead and Chrome is king.
Having worked for a company that lost many dollars and hours to Windows ransomware, I am not of the opinion that Windows is more secure than Apple's macOS.
And iPhones work better with Apple computers: if I want to develop code for iPhones, I can do it better on a Mac laptop than on Windows.
And my Mac laptop can run Linux in a VM, and Windows in a VM (if I pay for Parallels), and can run Visual Studio in Windows in a VM (if I am willing to pay for it). And it is much easier to install FOSS from Linux on macOS with brew than it is on Windows with Cygwin (though Windows with Cygwin isn't bad).

But if you LOVE Windows or you LOVE Linux please go on loving it.

I don't want a OS war.


_________________
ADHD-I(diagnosed) ASD-HF(diagnosed)
RDOS scores - Aspie score 131/200 - neurotypical score 69/200 - very likely Aspie


wbport
Sea Gull

Joined: 16 Sep 2012
Gender: Male
Posts: 220

13 Nov 2022, 11:18 pm

COBOL was my number one language. IBM Assembler, C, and JavaScript also came in handy on the job or as a hobby.
Some of my code is here but take off the last slash and "programmer" to see the rest of it.