Is there a schmo out there who spent his life learning something that is totally redundant and fruitless now?

Imagine you spent ten years of your life, say, mastering the shell on the Amiga, and when the Amiga finally went into administration you sit in the local jobcentre or welfare office thinking to yourself: my god, I don't know anything relevant anymore; I'm pretty much redundant and know nothing relevant to programming. (Being 'made redundant' is a horrible expression, and I think it may soon be considered un-PC.)

Well, what I want to know is: if I spend several years studying computer science, is there any way of knowing which fields will soon become redundant and unneeded?

I know there's the unknown factor in the future, like someone inventing psychic networking so we no longer need a computer or modem (hacking could be really, really fun).
Tbh, most skills overlap, so if you learn one thing (the Amiga shell), it makes it easier to learn something else (the Windows/DOS shell for example).
I suppose. I'm learning C# so much faster than I did C++, thanks to C++'s hardness, I guess. I'm still light years away from what jackson marie is doing, and my ultimate goal... creating an awesome OS.
Computer science is a large field; there is much more to it than just programming. I recommend going out and learning about the various aspects of computer science: database admin, sysadmin, networks (engineering and admin), software engineering, robotics, system design, analysis, etc. If you're interested, this article is pretty popular and gets linked around a lot. It's a good read.
http://www.joelonsoftware.com/articles/CollegeAdvice.html
Well, the Amiga is a computer like a PC. It obeys the same rules.

Saying that once you've mastered the Amiga you can't master the PC is like saying,
"I always used a hammer with a red ribbon. Now there are only hammers with green ribbons. I don't know what to do with this new hammer anymore."

devonrevenge wrote:
im still light years away from what jackson marie is doing
I hope so; (s)he's only doing nonsense.
Thanks Biscuit, I read it all.
If you want to read something pessimistic:
http://www.bloomberg.com/news/2012-04-22/software-engineers-will-work-one-day-for-english-majors.html

Some advice on how not to be a statistic,
The Pragmatic Programmer, by Andrew Hunt and David Thomas:

...Your knowledge and experience are your most important professional assets.

Unfortunately, they're expiring assets. Your knowledge becomes out of date as new techniques, languages, and environments are developed. Changing market forces may render your experience obsolete or irrelevant. Given the speed at which Web-years fly by, this can happen pretty quickly.

As the value of your knowledge declines, so does your value to your company or client. We want to prevent this from ever happening.

Your Knowledge Portfolio
We like to think of all the facts programmers know about computing, the application domains they work in, and all their experience as their Knowledge Portfolios. Managing a knowledge portfolio is very similar to managing a financial portfolio:

1. Serious investors invest regularly--as a habit.

2. Diversification is the key to long-term success.

3. Smart investors balance their portfolios between conservative and high-risk, high-reward investments.

4. Investors try to buy low and sell high for maximum return.

5. Portfolios should be reviewed and rebalanced periodically.

The rest of the section explains these analogies and talks more about how to manage your knowledge portfolio.
hope so, (s)he's only doing nonsense
:O that's Spoonlicker-esque!

EDIT: the he/she thing, and the nonsense, and when I asked her in private if she was Spoonlicker she said "I am
not
a spoonlicker", as if she would have known that a Spoonlicker is a someone and not a thing!!

And this whole time she seems to be giving off this English-isn't-my-first-language vibe... it just doesn't scan, you know.

Hear that, JM? I still suspect you're secretly Spoonlicker!

EDIT: @seeplusplus yep, this is what I was worried about, but as Biscuit's article said, knowing computer science and C is the basis of everything software anyway.
@OP
Yep, programming... and I keep at it because you never know when it will start being fruitful again.
Sigh...

Actually, I'm very busy. So devonrevenge, is it your revenge? Don't make a new thread and then talk nonsense here; as for your doubt, I simply usually check and post from my small mobile.
Sorry Jackson, if you're not Spoonlicker. No, I genuinely worried about some aspects of thinking about doing it for work.

@BHSpecter, good attitude, sort of, and even if you loved the Amiga shell I guess you could still play with it if you like.
Heh, had to make some kind of joke; sadly my humor is... well... poor, to say the least.
closed account (S6k9GNh0)
According to OSdev, a lot of people spend years making an OS just to realize their OS is crap. That must suck...
I started thinking about making my own OS, then realized I had to learn a butt load more than I thought and just put it on the shelf for good.
@computerquip
Not really, it's a nice feeling when all the code running on a machine is your own, even if all it can do is respond to interrupts. My OS projects have all failed so far and I keep doing it because it's fun. You just need to have the right attitude.
According to OSdev, a lot of people spend years making an OS just to realize their OS is crap. That must suck...

Of course your operating system is going to suck. Modern operating systems are the work of hundreds or thousands of people, and some of them are made of millions of lines of code.

Chrisname's operating system probably won't be used by anyone for any practical purpose, but the experience might help him get a job making a lot of money at a top-tier software company some day.
I wanted to do OS dev, but I just never learned assembly well enough to do that, and my C knowledge isn't all that great. From what I understand, the books for C++ avoid C methods of doing things, so you would have to learn C, or at least aspects of it like printf/scanf instead of cout/cin.
You can write an OS in any Turing-complete language*, and C++ is no harder than C once you get it to compile, and IIRC all you have to do is make sure the constructors and destructors get called, which only involves writing a special linker script (which you have to do anyway). However, you need to know the assembly language of your chosen platform, that's pretty much unavoidable**.

I recommend you give it a go. It's not that hard to get started, there are tons of tutorials on the Internet you can follow. They'll teach you to make an absolutely basic kernel, and then, once you get the hang of things, you can write your own from scratch (by reading documentation, which, by the way, is really boring. Thankfully a lot of the time you can avoid it by finding pages on the OSDev Wiki and other sites).

* But with interpreted/managed languages you have to write the interpreter/virtual machine yourself, or else write a native compiler or find one that someone else has written. It's a good idea, therefore, to stick to languages that already have native code compilers, because not only are compilers about as difficult as operating systems to write, if there are any bugs in your compiler then there will be bugs in your operating system, and you'll have to debug two programs instead of one to figure out what's causing each problem (and believe me when I say that debugging an operating system is the worst programming experience I've ever had). It's even worse if your compiler is for a language of your own design, because that adds a third possible source of bugs - language design flaws. So, to summarise, stick to existing languages with widely-used native code compilers, such as C and C++.

** Unless, maybe, if you use a managed environment written by someone else. They do exist, but as I pointed out in my previous footnote, it's a bad idea because it introduces another potential source of hard-to-diagnose bugs. Just to reiterate, debugging an operating system is the worst thing I've ever had to do in programming, and that was in C and assembly using gcc and nasm, so I knew any bugs were in my own code.
Topic archived. No new replies allowed.