Is Programming Becoming a Lost Art?

Wow. Huge thread. I'll be honest, I read about halfway through, then I got itchy typing fingers.

Framework, it's easy to tell from your posts (in other threads as well as this one) that you're a good programmer and an intelligent guy. However, I just can't understand the foundation of this argument.

I think if you apply the same logic to other aspects of life, it sounds just as odd. For example, take cars. I know how to drive a car. I know what to do in certain situations. I know when to change gears, which gears to change to and how to operate the indicators. I have no desire to know how any of those components work, and the absence of such knowledge doesn't determine how good a driver I am.

Bad programmers will write bad code. Good programmers will write good code. The definition of good and bad code is a matter of opinion and will vary across the board.

Is it becoming a lost art? Absolutely not. In terms of art, what is the actual art? Is it the final painting? Or is it the brush strokes involved? Is it the process or the product? There are times when I've opened up a software application written by someone far smarter and far more skilled than I am and had a real moment of appreciation for what it does. An example of this was the first time I used Google Goggles a few years back, or the first time I used iOS. A lot of hard work has gone into creating an excellent product that functions how I'd expect it to. Surely that's the aim?

EDIT: Consideration - As a programmer, would I appreciate any of those products more if they were primarily written in a more primitive language?

Answer: Probably not, no.
closed account (z05DSL3A)
iHutch105 wrote:
I think if you apply the same logic to other aspects of life, it sounds just as odd. For example, take cars. I know how to drive a car. I know what to do in certain situations. I know when to change gears, which gears to change to and how to operate the indicators. I have no desire to know how any of those components work, and the absence of such knowledge doesn't determine how good a driver I am.
The logic is not quite the same. Yes, to drive a car you don't have to be a mechanic, but can you say you are a mechanic if all you can do is drive the car (no matter how well)?
The car example might not be perfect, but it is correct. The point is that not everyone wants or needs to get 100% out of his car. Maybe if you study your engine you'll find the perfect way to minimize your fuel usage, while the onboard instruments only provide a rough estimate. Does that mean everyone should spend years studying his engine, and every engine he'll ever use?

A good cook can still use mixed herbs (or do you mix your own curry? Your own chicken herbs?), even though a master chef might add each ingredient manually to get the perfect blend.

Long story short: not everyone needs such fine-grained control. A language should offer both and, to my limited knowledge, I'd say C++ offers a wide range of options. Whether it's sufficient is a matter of opinion, but Framework's examples make me believe he's greatly overestimating the "problem".
closed account (zb0S216C)
It seems some people who've participated in this thread are somewhat confused as to what my main point is. Here's why I started this thread:

Back in the days of assembly, programming was difficult, and developing even trivial programs was a tedious process. However, in those days, a programmer had to study the architecture of the CPU he/she was using. In my eyes, programming that close to the CPU was definitely a skill, and no doubt an art in some programmers' eyes.

As years passed, programming languages became more simplistic & automated. This enabled programmers to distance themselves from the underlying details of the architecture. At this point, programmers didn't have to learn the architecture, since it was optional. As languages progressed, programmers became less and less skilful in low-level development, and with it came a fading of the art that is programming. When I used C#, I felt so distant from the architecture; I didn't feel like I was in control, and the compiler was doing things behind my back that I should know about (like the implicit virtual example).
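
To illustrate what I mean by "behind my back", here's a rough sketch with made-up classes. I'm using Java for the "implicit" side of the contrast, since Java makes every instance method virtual by default, while C++ makes you spell it out:

#include <iostream>

struct Base {
    void plain()          { std::cout << "Base::plain\n"; }   // non-virtual: resolved at compile time
    virtual void spoken() { std::cout << "Base::spoken\n"; }  // virtual: resolved at run time
    virtual ~Base() = default;
};

struct Derived : Base {
    void plain()           { std::cout << "Derived::plain\n"; }
    void spoken() override { std::cout << "Derived::spoken\n"; }
};

int main() {
    Derived d;
    Base& b = d;
    b.plain();   // prints "Base::plain"     -- the static type decides
    b.spoken();  // prints "Derived::spoken" -- the dynamic type decides, via the vtable
}

In C++ nothing is virtual unless I write the keyword; in a language where every method is implicitly virtual, that run-time dispatch machinery is there whether I asked for it or not.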

To support this, xerzi posted this:
xerzi wrote:
"I overheard some other students talking about the then new "metro" (Windows 8 style UI) and the new API's that comes with it. They were marveled at the fact the guy (a demo where they wrote some code to showcase it) only wrote 5 lines of code and had a complete and running image editor."

This shows me just how lazy programmers can be, and just how automated programming has become.

So overall, the low-level art of programming is becoming a distant memory in some programmers' eyes, and the fact that programming is becoming so automated & distant from the architecture is almost painful.

OK, some participants of this thread may not agree with me, or they might have actually been offended. For that, I'm sorry, but my opinion stands. Now, though, I hope you understand why my concern is with the near death of the art of programming.

Edit: I did say in my previous post that my argument is not directly aimed at any language.

Wazzak
closed account (z05DSL3A)
Gaminic, okay, your cook/chef analogy is a bit better.

I get premixed spices and other ingredients and follow a recipe to cook a meal. I don't have the required skill set (the art) to be considered a chef. If you say, "Well, you cooked the meal and it was nice, therefore you are a chef," then the 'Art' of being a chef is diminished.

So can the same be said for programmers? If you use a high-level language that supplies the ingredients for you and you are just blending them together, are you a programmer? What skill set do you need to be considered a programmer? Again, if the skill set needed to be considered a programmer is reduced, then some of the 'Art' is lost, or at least marginalized.

Just so we are straight, I'm not saying that people who use high level languages are not programmers.
Yeah, the chef analogy is better. Wish I'd thought of it. :-)
closed account (3hM2Nwbp)
framework wrote:
As languages progressed, programmers became less and less skillful in low-level development


...and to make themselves feel better about their lacking knowledge, they adopt the ideology that anything low-level is both dangerous and evil, such that if someone uses it, they're a bad programmer and are then subjected to the angry mob of zealous Java-noids.

I can't help but pick on Java developers; it's just so tantalizing. :)
closed account (z05DSL3A)
Luc Lieber wrote:
I can't help but pick on Java developers

:0( I've just started learning Java ... for Android and cloud computing ... but to balance it out I've also got a Raspberry Pi and a breadboard for some low-level physical computing goodness.
Well, I'll be taking C this semester, and I also have to take Assembly/Computer Organization next semester, along with Java. These are all courses required to get an associate's degree in CS at my community college and to maximize your chances of getting accepted to a decent university.

After that, I'll transfer to a university, and I'm sure I'll be forced to learn more low-level stuff along the way.

Given that the vast majority of programmers will be unable to get a job worth mentioning without at least an associate's, I really don't see a problem with programmers not understanding low-level details and computer architecture.

When it comes to the way all of these people apply themselves professionally in the long run, I think some of them will get jobs where they need to program at a low level, and some will get jobs where they need to program at a high level.

And there certainly are jobs where you need to know architecture and low-level details. When I was browsing, for example, NVIDIA's job offers, none of them asked for C#, Java, or Visual Basic, just C, and maybe a scripting language like Python.

I've thought it would be cool to develop GPGPU technology. I've used OpenCL a bit in one of my programs, so I can say, in a conversation, "I've written about a dozen kernels!" They were kernels to run on a GPU, written in C, and about 10 to 40 lines apiece, but ...
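
For the curious, one of those kernels looks roughly like this. It's a made-up example along the same lines, not pasted from my actual code:

// A minimal OpenCL C kernel: element-wise addition of two float arrays.
// One work-item runs per array element; the names here are illustrative.
__kernel void vector_add(__global const float* a,
                         __global const float* b,
                         __global float* result)
{
    size_t i = get_global_id(0);   // this work-item's index in the global range
    result[i] = a[i] + b[i];
}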

Anyways, to sum it up: I think the art of low-level programming is not fading; it's still used all over the place. I think the vast majority of computer scientists / software engineers know low-level stuff, or at least were forced to learn it at one time. I just think it should be used when there is a point to it, and not when it's effectively pointless.
Framework wrote:
As languages progressed, programmers became less and less skilful in low-level development


This is totally false.

There isn't any shortage of programmers who understand lower-level architecture. If anything, there are more than ever. Yes, the percentage of programmers with that skill may have gone down, but that's a meaningless statistic, because virtually anyone can be considered some kind of programmer these days.

But maybe that's your point? Maybe you think hobbyist kids who don't really care about professional/serious development and are only interested in pumping out quick programs for the novelty... that they're somehow ruining it for the rest of us. If that's the case then the only thing I can tell you is to grow up. That's basically like saying "I was a programmer before it was cool" -- nobody cares.


(yes I know you aren't singling out C#, but I'm using it as an example here):
You seem to have it in your head that coding [well] in C# doesn't require you to understand what C# is doing behind the scenes. That is absolutely false. In fact, the exact opposite is true... to really use the language well, you have to have a firm understanding of exactly what it's doing. The same is true of C++, and of C, and even of assembly.

In a sense, that makes languages like C++, C#, Java, etc. much harder to learn and master. They're significantly deeper and much more complicated.


I didn't feel like I was in control, and the compiler was doing things behind my back that I should know about (like the implicit virtual example).


If you don't feel like you're in control, you don't know the language well enough.
If you think the compiler is doing things behind your back that you should know about... you're right... and you should know about them.

At least... that's if you want to master the language.


So overall, the low-level art of programming is becoming a distant memory in some programmers' eyes, and the fact that programming is becoming so automated & distant from the architecture is almost painful.


Then again... (and the more I read this thread, the more I realize this might be the actual underlying issue)... maybe you're just a dinosaur. And it's painful because you don't like how competitive it is now, and you want things to go back to how they were 30 years ago, when making any program (no matter how simple) was considered wizardry.

There are a million good reasons why you shouldn't work directly with the architecture, and why you should use existing libraries when available. I've listed some of them before, but in case you missed them... here are some again:

- Portability
- Speed of development
- Maintainability of code
- Code reusability
- Minimization of bugs
- Ease of communication between developers on large teams


It's not a matter of programmers being lazy. It's not a matter of them not understanding what the languages/libs are doing. It's a matter of being productive. Reinventing wheels and inlining assembly to squeeze out the best performance of the architecture isn't being productive -- it's a waste of time.

So while you are working tirelessly to hand-craft every detail of your one program by tweaking bits with a hex editor -- your neighbor Biff spends the same amount of time pumping out 5 programs that are all just as functional as yours by using higher-level languages and existing libraries.

Although you do have one advantage... your program is so well optimized that it will run seamlessly on a 500 MHz Pentium machine running Win 95, whereas Biff's programs will stammer and choke on that platform. Too bad nobody runs those anymore.


So yeah, I guess it is kind of painful to watch these kids whiz by you. Sounds like you'd better get with the program... because technology moves really damn fast, and it's just going to get worse over time.


EDIT:

Also, I just want to reiterate: coding well now isn't any less of an art form than it was before. It's still very difficult (probably even more difficult than it ever was), and it still takes a lot of knowledge and creativity.
Too bad nobody runs those anymore.

;-)

Coding well now isn't any less of an art form than it was before.

I have a simple analogy: low-level = marathon, high-level = Formula One.
Framework doesn't like that F1 drivers are faster while being less fit. He also scoffs at how they need a race track.
closed account (z05DSL3A)
Disch wrote:
maybe you're just a dinosaur.
or maybe you are one of the giants that Biff wants to stand on the shoulders of.

"we are like dwarfs on the shoulders of giants, so that we can see more than they, and things at a greater distance, not by virtue of any sharpness of sight on our part, or any physical distinction, but because we are carried high and raised up by their giant size."
"It has been said that physicists stand on one another's shoulders. If this is the case, then programmers stand on one another's toes, and software engineers dig each others' graves." -- Unknown
[Quote on 5 lines of code for a working UI] ... This shows me just how lazy programmers can be, and just how automated programming has become.

I don't see the problem with this. Why should every programmer be forced to reinvent the wheel? Yes, it might be more artful to do everything yourself, but what's the point? It's a waste of time, and chances are the results will be shit anyway.