Better at '90s programming than modern programming?
Back in the good old days of DOS, programming was simple: you wrote a set of instructions, the computer went through them one by one, and if there was a problem during the run it would take you right to the line where the error occurred, so you could figure out what led to it. Granted, I didn't really know how to use functions and classes to organize code then, but they did exist.
But now, more and more complicated development environments make debugging nearly impossible. Granted, putting code into functions and classes improves organization and readability, but I've worked with several environments that don't even tell you (by default) what line in your program it was on when a run-time error occurs, or at least the last line you wrote before it went off into some pre-made code deeper down! That leaves me either running the program again and again with debugging hacks to find the problem, or trying to figure out how to use the special-purpose "debugger" for it.
I've aced programming assignments in school, but school is not likely to teach me anywhere near 100% of what I'll need for the real world. My professors might be giving me TOO many functions, classes, etc. to start with!
Although if I can get a job I'm unlikely to be programming anything completely from scratch, at least.
_________________
Your Aspie score: 98 of 200
Your neurotypical (non-autistic) score: 103 of 200
You seem to have both Aspie and neurotypical traits
AQ: 33
I understand, but I think it's more you. Modern IDEs should be able to provide extensive information about program state at the point where an exception is thrown. I disagree with the poster who said he can work more efficiently at the command line; having the big picture laid out in a GUI with expandable tree views is a big deal.
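And even outside an IDE, the runtime itself usually records exactly where things went wrong; you just have to read the stack trace. A minimal Java sketch (the class and method names here are mine, just for illustration):

```java
// Demo: a deliberate ArithmeticException, and what its stack trace tells you.
public class TraceDemo {
    // Helper that fails at a known spot when b == 0.
    static int divide(int a, int b) {
        return a / b;
    }

    public static void main(String[] args) {
        try {
            divide(10, 0);
        } catch (ArithmeticException e) {
            // The top stack frame records the method, source file, and
            // exact line number where the error occurred.
            StackTraceElement top = e.getStackTrace()[0];
            System.out.println(top.getMethodName() + " at "
                    + top.getFileName() + ":" + top.getLineNumber());
        }
    }
}
```

Run uncaught, the same information is printed automatically, so "which line was it on" is answered without any special debugger.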
Well, I didn't really say it was more efficient, just that I prefer not to use IDEs.
Also, it's rude to refer to random people as "he." Not everyone on the internet is a guy.
Software development advances by introducing greater levels of abstraction. The low-level libraries in C were built upon by higher-level abstractions in (say) Java, where string-handling functions were augmented with threading and networking libraries. From there, even higher levels of abstraction like J2EE were introduced, and on top of those came frameworks like Spring and Hibernate, and then higher levels still, like MVC flow of control.

In some ways, operating at a higher level of abstraction makes typical jobs easier to do. Hibernate is easier than hand-coding database storage and retrieval code for every object. On the other hand, the whole exercise becomes a house of cards. Java has like three or four I/O libraries now. J2EE is a breathtaking mess, with JSP, JSTL, EL, and so on. JSP pages that have to use JSTL/EL and also emit JavaScript, jQuery, and CSS are almost incomprehensible. (Is that a JSTL tag, or an emitted markup tag? Is that an anonymous JavaScript array, or a CSS list?)

But there's no going back. Remember, there was a time when high-level abstractions like C's string library were considered too slow.
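The point about Java's overlapping I/O libraries is easy to show concretely. Here is the same task, reading the lines of a text file, in the original java.io style and the later java.nio.file style; a sketch only, and the class name is mine:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class IoGenerations {
    // Circa-1996 java.io style: explicit reader objects and a manual loop.
    static List<String> readOldStyle(String path) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader r = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = r.readLine()) != null) {
                lines.add(line);
            }
        }
        return lines;
    }

    // NIO.2 style (Java 7+): one call does the same job.
    static List<String> readNewStyle(String path) throws IOException {
        return Files.readAllLines(Paths.get(path));
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".txt");
        Files.write(tmp, Arrays.asList("first line", "second line"));
        // Both generations of the library return the same result.
        System.out.println(readOldStyle(tmp.toString())
                .equals(readNewStyle(tmp.toString())));
    }
}
```

Two APIs, one job; neither ever went away, which is exactly the house-of-cards effect described above.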
I've always used Emacs and the command line. What's strange is that everyone talks about how fast I complete tasks. They're amazed at how quickly I get things done. I'm not that fast, I just don't have layers between me and what I'm trying to do. I think the IDE was designed to slow programmers down and interfere with getting work done.
This is sometimes true, but you're also talking about Java, which isn't a particularly common language outside Android and academia. Many advances are nothing but refactoring or accounting for hardware advances, and some higher levels of abstraction come with zero or near zero performance penalty (intelligent use of C++ templates, e.g. RapidXML).
I will say that I very often forgo abstraction libraries when developing products if they don't offer significant, tangible benefits, and I produce very efficient software (likely more efficient than what people were producing in the '90s) that is at least as feature-rich as anything else you'd find these days. Sometimes I develop things that people might call "reinventing the wheel" that really only require a couple thousand (or fewer) lines of code. It hasn't really been a problem, and maintenance is easier than dealing with an external library.
ruveyn
C has almost universally given way to C++ and in some cases C#. People are using toolkits like .NET for rapid development.
JavaScript is powerful enough that browser development is dominating.
People are accomplishing things with "languages" like CSS, which aren't even really programming languages, that would have required an actual programming language in the past.
Realtime graphics is pretty much all done with shaders.
I've always used Emacs and the command line. What's strange is that everyone talks about how fast I complete tasks. They're amazed at how quickly I get things done. I'm not that fast, I just don't have layers between me and what I'm trying to do. I think the IDE was designed to slow programmers down and interfere with getting work done.
If you're that efficient, it's probably more you than emacs.
I don't have to fight with my IDE that often, unless it's Xcode.
Well, I didn't really say it was more efficient, just that I prefer not to use IDEs.
Also, it's rude to refer to random people as "he." Not everyone on the internet is a guy.
Whatever
Java is widely used in the business world, particularly Java 2 Enterprise Edition and IBM's WebSphere. With Struts and either Spring or Hibernate, J2EE is a major platform for business workflow automation.
But now, more and more complicated development environments make debugging nearly impossible. Granted, putting code into functions and classes improves organization and readability, but I've worked with several environments that don't even tell you (by default) what line in your program it was on when a run-time error occurs, or at least the last line you wrote before it went off into some pre-made code deeper down! That leaves me either running the program again and again with debugging hacks to find the problem, or trying to figure out how to use the special-purpose "debugger" for it.
I've aced programming assignments in school, but school is not likely to teach me anywhere near 100% of what I'll need for the real world. My professors might be giving me TOO many functions, classes, etc. to start with!
Although if I can get a job I'm unlikely to be programming anything completely from scratch, at least.
Are you nostalgic for interpreted languages like BASIC and QuickBASIC? If so, pick a scripting language and run with it.
I've always used Emacs and the command line. What's strange is that everyone talks about how fast I complete tasks. They're amazed at how quickly I get things done. I'm not that fast, I just don't have layers between me and what I'm trying to do. I think the IDE was designed to slow programmers down and interfere with getting work done.
I've also found that Emacs and the command line greatly increase productivity. It helps that Emacs is so customizable that you can get it to behave pretty much however you want.
The only disadvantage of these tools is the steeper learning curve (which tends to scare a lot of people away). IDEs are designed with average and below-average programmers in mind, and for this reason they are fairly easy to use (otherwise those users wouldn't use them). However, they are not as efficient once you become a proficient Emacs user.