
Growing Up With Computers

source link: https://hackaday.com/2021/06/19/growing-up-with-computers/


My son is growing up with computers. He’s in first grade, and had to list all of the things that he knows how to do with them. The list included things like mousing around, drawing ghosts with the paint program, and — sign of the times — muting and unmuting the microphone when he’s in teleconferences. Oh yeah, and typing emojis. He loves emojis.

When I was just about his age, I was also getting into computers. But home computers back then were in their early years as well. And if I look back, I’ve been getting more sophisticated about computers at just about the same pace that they’ve been getting more sophisticated themselves. I was in grade school during the prime of the BASIC computers — the age of the Apple II and the C64. I was in high school for the dawn of the first Macs and the Amiga. By college, the Pentiums’ insane computational abilities had just started to match my needs for them to solve numerical differential equations. And in grad school, the rise of the overclockable multi-cores and GPUs powered me right on through a simulation-heavy dissertation.

We were both so much younger then.

When I was a kid, they were playthings, and as a grownup, they’re powerful tools. Because of this, computers have never been intimidating. I grew up with computers.

But back to my son. I don’t know if it’s desirable, or even possible, to pretend that computers aren’t immensely complex for the sake of a first grader — he’d see right through the lie anyway. But when is the right age to teach kids about voice recognition and artificial neural networks? It’s a given that we’ll have to teach him some kind of “social media competence” but that’s not really about computers any more than learning how to use Word was about computers back in my day. Consuming versus creating, tweeting versus hacking. Y’know?

Of course every generation has its own path. Hackers older than me were already in high school or college when it became possible to build your own computer, and they did. Younger hackers grew up with the Internet, which obviously has its advantages. Those older than me made the computers, and those younger have always lived in a world where the computer is mature and taken for granted. But folks about my age, we grew up with computers.

This article is part of the Hackaday.com newsletter, delivered every seven days for each of the last 200+ weeks. It also includes our favorite articles from the last seven days that you can see on the web version of the newsletter.

Want this type of article to hit your inbox every Friday morning? You should sign up!


57 thoughts on “Growing Up With Computers”

  1. Danjovic says:

    Z80 assembly language is a tasty snack for the eager mind of a teenager. Those were good times!

    1. Darren Jones says:

      Z80 machine code convinced teenage me that I had no future programming computers. It took me 30 years to get over that and realise I did have what it takes.

    2. Chris J says:

      Why would you teach your kid a dead programming language? There are things I hate about Python, but if my kid wants to learn about programming, the first language she’ll learn is Python. It runs on everything, it’s easy to learn, and there are tons of libraries and example code.
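      For what it’s worth, the "easy to learn" claim is easy to illustrate: a complete first program fits in a dozen readable lines of Python. This is just an illustrative sketch (the game and the names in it are invented, not from the comment):

```python
# A number-guessing game: the classic "first program", shown here to
# illustrate how little ceremony Python demands from a beginner.
import random

def play(guesses, secret=None):
    """Check each guess against the secret number, printing hints."""
    if secret is None:
        secret = random.randint(1, 10)
    for guess in guesses:
        if guess == secret:
            return "you got it!"
        print("too low" if guess < secret else "too high")
    return "out of tries"

print(play([3, 9, 7], secret=7))  # two hints, then "you got it!"
```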

      1. John says:

        Either python or Lua. Python is probably better, but Lua got me into real programming.

        1. Matthew Trey says:

          Lua? Really? I mean yeah python is the new basic but Lua?

          1. chango says:

            Lua is used as a scripting engine in many games. If you want to create advanced content in Roblox, you need to know Lua.

            Going back to the ’80s home computer era, kids learned BASIC to make their own games or modify existing ones, not for intrinsic reasons. I wonder how many software engineering careers were started by putting lewd messages into NIBBLES.BAS?

        2. And it’s easy enough to embed a Lua interpreter in your C code. Bit surprised folks don’t do this more with microcontrollers.

          1. jonmayo says:

            I used Lua in u-boot to do factory diagnostics. Instead of releasing new binaries when something was added or changed on the board, I could update what I called a “config file”, which was a pretty lengthy Lua script.

        3. Johnix says:

          Actually, Python requires you to wait until they’ve mastered written language.

          For 1st graders, Scratch is the way to go.

      2. Matthew Trey says:

        Dead programming language? What are you, a millennial?

        1. Inhibit says:

          The phrasing did give me a mental picture of Z80 specifications written on papyrus.

      3. Foldi-One says:

        The older, simpler computers like the Z80 machines and their most bare-metal languages are well worth learning, as they are simple enough to comprehend how the CPU processes it all: the nuts and bolts of how a computer works, which is very important knowledge.

        Rather than the pile of abstraction upon abstraction of a ‘modern’ language. Though learning one of those afterwards should be much easier.

        1. Michael Black says:

          I’m not so inclined to believe the CPU matters.

          But nobody was doing “bare metal”. The Altair had the front panel, and I recall it could be single-stepped. The KIM-1 had a great monitor in ROM. Not only could you put code in memory, you could set breakpoints and single-step, and there were routines to get bytes from the keypad and display results. It’s been 40 years, but I think you could check registers and status codes.

          It’s that environment that made it friendly. You could put in one opcode, and any operand, and see the results, and then string it with other instructions to write code, still not needing to go “bare metal”.

          That same sort of thing could happen today if a similar monitor ran under Linux.

          The real difference is you can’t hand-assemble with today’s CPUs.

          I’ve never used C much, so when I needed a simple program a few years ago, I relearned the same way. Extra lines to display values, and to print “here we are” so I could follow my errors.

          1. Foldi-One says:

            I would say it matters enough to be worth learning the more bare-metal language for something simple enough to actually understand. That means you should have a far better grasp of what your program actually needs to do and how it does it, which should mean more efficient coding choices and a better intuitive grasp of the potential security issues and bugs that all the abstraction layers hide from you while not always actually preventing them.

            I’ll admit I haven’t done so myself; my understanding of this, which is still more limited than I would like, comes from the other end: learning FPGAs and softcore CPU development, which I’m still far from mastering. HDLs certainly feel like a lot of work to get your head around.

        2. Daniel Dunn says:

          Anything simple enough to understand every part of is probably obsolete or at least legacy. It’s important that someone understand how CPUs work, but I don’t need to know more than the most basic level of that, because I’m not a CPU designer and that’s an implementation detail I don’t see, because I don’t code assembly.

          I really don’t buy the whole specialization-is-for-insects thing. The only way to be able to do a bunch of different things is to practice them, and there are only so many hours in the day. Nothing as advanced as the modern world would happen without some degree of specialization and a whole lot of acceptance of black boxes.

          I’d rather have something that works perfectly that I don’t need to understand, than something that breaks down every few years but that I know every detail of.

      4. jonmayo says:

        Every version of Python eventually dies. But you’re right about lots of online resources, and it’s not just examples but other people to talk to about it.

        I think the draw of some of these 8-bit systems as a starter is that they were smaller and easier to hold within a single brain. They were less of an opaque magic box than a modern language with garbage collection and type inference.

        I’m of the mind that people who are interested in learning stuff should learn all the different systems. Even if they just sit in a short video lecture of how some of these old systems worked. Rather than dive right into development on a Z80.

        I’d lean more towards an Arduino than a Z80 for kids looking to get some experience with low-level and systems programming. It’s popular enough that resources are available. C is a little hard to learn, but it’s cut-and-paste friendly, and most Arduino programs are relatively simple and avoid complicated data structures and exotic language features. Running without an OS is both liberating and challenging: not having files, networking sockets, graphics, and multitasking easily available is limiting, but it also means you don’t have the pressure to dive into doing those things right away.

  2. Robert R Ando says:

    We didn’t grow up with computers, we had to invent, then build them.

  3. therogerv says:

    1975 to 1995 is the statutory Golden Age of personal micro computing — it was personal computing’s equivalent of the Cambrian Explosion. Thereafter computers settled down and really became sort of boring (even though vastly more powerful).

    If you lived during the years when BASIC was in the ROM of most personal computers, there at your fingertips when the computer was turned on, you lived during a rather special era — consider yourself lucky to have witnessed it first hand.

    1. Earl Colby Pottinger says:

      And not just BASIC: a few had Forth, at least one had a version of APL, and then there was the SuperPET with BASIC, APL, FORTRAN, COBOL, and Assembler at the flick of a switch.

      Always there, always available as soon as you turned on your machine.

      I miss that. The languages were slow compared to what you run today, but you could have ideas and changes in seconds.

    2. Greg A says:

      yeah i really wonder how (and what) younger generations learn. the computer was so finite then, a small reference library could really cover everything you needed to know. now, you have to tune out the overwhelming complexity and focus on one element of it. it’s a *lot* easier to get things done, but you get a totally different view of the computer and you get used to the majority of the layers being completely opaque to you.

      but man, i did just do malloc(0x100000000ULL) the other day. the future truly is amazing

      1. This, too! I remember running into Bill Gates’ (apparently apocryphal: https://www.computerworld.com/article/2534312/the–640k–quote-won-t-go-away—-but-did-gates-really-say-it-.html) 640K after I read an article about Markov Chains in Scientific American. Now, with smart programming, one could have worked around the sparse matrices, but I was like 15, damnit.

        Nothing says YOLO like calling out a “ULL” in a malloc call. That’s just badass!

        1. Michael Black says:

          Why would Bill Gates have said that? He didn’t design the IBM PC, he just sold IBM BASIC and the operating system.

          People did make weird pronouncements, so I can’t even remember if I heard Bill’s back then. But every time it’s come up in recent years, it just hits me that he wasn’t the designer, so why would he say it?

          It’s on the same level as people thinking Steve Jobs was a technical type. Or that the Apple II was the first “personal computer”.

          1. chango says:

            He’s a businessman, and the success of his business was loosely tied to the success of the PC platform. Of course he’d want to downplay an obvious architectural limitation.

      2. Bob says:

        I’m 16. The number of platforms, languages, libraries, and IDEs is overwhelming, but I now realize that if it works, it works.
        The other thing is that with the internet showing every amazing project everyone else has done, it makes it seem like anything less than a groundbreaking project isn’t worth doing.

        1. Hirudinea says:

          Ah come on, you’re 16! Consider what level your skills were 20 years ago and what that will be 20 years from now, keep working and you’ll be making groundbreaking stuff.

        2. Duh says:

          Not only groundbreaking, but also with a social presence and video-editing skills that folk love, and so much more. And then, once you’ve dedicated so much into that groundbreaking thing, that took years to achieve, you’ve got to come up with ever more, on ever tighter time constraints, lest your fifteen minutes of fame be lost to bit-rot, and you’ve got to start all over again.
          Yep… you’re getting it!

        3. Neil says:

          “The other thing is that with the internet showing every amazing project everyone else has done, it makes it seem like anything less than a groundbreaking project isn’t worth doing.”

          I hear that. I think that’s part of the appeal of retro-computing: That I could do something that was never done back when they were new technology.

        4. Inhibit says:

          It usually takes a little reading to get to the part where they’ve spent a lifetime getting the experience then four years of downtime development.

          Everything looks like magic when all you see is the end result. Take it as inspirational rather than discouraging.

        5. my2c says:

          – I hear you on the ‘internet showing every amazing project’ making your own seem not worth the effort, and it does feel like that has become a great downside of all the information at our fingertips.
          – I grew up more in a time (along the lines of this article) when you could get a book from a library to figure out how to do something technical with a console, or later a PC – but not necessarily complete your idea. You felt like your ideas were novel, creative, worth pursuing, learned a lot in the process of executing them, and had the satisfaction of feeling like you created something new and great when you were done – leaving you wanting more.
          – Today you can get an idea that is novel to you, google something, and see 20 people have had the same or similar ideas and built it already, and probably a few of them have built something vastly more refined and complex (quite possibly having less limited time and resources to pursue it than yourself). It takes much if not all of the fun out of it, at least for me. Your fun new idea for a build is now just a copy of someone else’s documented work, and more following instructions than creating and problem solving, many times just to get to something less impressive than what you found along the way (not necessarily due to your skills and abilities, but realistic resource and time constraints again). Even if you have the motivation to build your idea at this point, it’s hard to look at the outcome as an accomplishment rather than as ‘not as cool’, falling short of the similar projects you came across while researching.
          – I’m not sure how we best deal with this, but it does seem to be a great hindrance for the younger generations in ‘catching the bug’ of building, creating, and learning – not to mention an issue for us ‘older’ folk too.
          – I guess the main thing to keep in mind is that every challenge you take on expands your own skills and knowledge base, regardless of whether someone else has done it already, or how well they did it. At least in builds for our own enjoyment, maybe we need to go out of our way a bit sometimes to avoid looking at similar projects – and if you run into a stumbling block, look for answers to that very specific stumbling block, and not similar projects/solutions in general. Trying to build a rabbit-repelling water turret for your garden? Keep the googling to servo pinouts, image recognition, etc. You, and likely your friends and family, will still be impressed with what you come up with, not having seen similar. Only if you submit it to HaD will someone point out ‘so-and-so made this one 10x better, and with some better design decisions’ and deflate your ego :-).

          1. David says:

            I’m 30ish so I grew up with the rise of the modern internet rather than computer software/hardware.

            I think the current proliferation of projects and information is great!
            If you want to learn about metal working and casting, fishing, sports, electronics, gardening, home repair, car repair, woodworking, programming, cooking, baking, animal care, or whatever; there are still the print resources but with a quick internet search you can find others out there with the same interests to learn from and grow with.

            So yeah, it’s important to me to understand ‘why’ something works, but redoing the research from the ground up (and the associated time commitment) is less so.

            I might not be an inventor but I would rather be something of a renaissance man and know/do lots of different things than be an expert in a specific area.

            Oh yeah, home brewing.

            I wouldn’t worry about the younger generation, just keep putting the opportunities out there and they will take advantage of it. There will always be people who want to take advantage of it even as there will always be people that only know how to interact with a phone interface and never know (or need to know) how to use other input devices.

            Just some thoughts.

        6. Tim Trzepacz says:

          It isn’t a competition.
          Do stuff because you want to.
          Do it your way.
          Make what interests you.
          Do it because it’s fun!
          Not what you think will get you status or views or will one up somebody else.

  4. Michael Black says:

    I have no idea when I first noticed TV; it was there, and not a novelty to me.

    But about 1969, there was a story in the paper about two kids around my age who built their own computer. That’s when I wanted my own. (In retrospect, they must have built one of those demonstrators out of straws or cardboard, but I didn’t know that at the time.)

    I couldn’t afford a computer in 1975, but it was part of my life; I was fifteen. I have always had a computer since May 1979.

    I think there’s a difference between wanting something, seeing it develop, and something that’s everywhere.

    Lots of talk about “digital natives”, but they had nothing to do with it. It’s an appliance, their skill is mostly social. They didn’t create this world, it was created for them, and created as a thing where they could easily participate.

    Much of the population waited until it was a Disneyfied space, easy to use, and controlled. Even groups that had webpages in 1996 weren’t really using the medium effectively; they were just markers. They waited until Facebook and Twitter were already being used by the masses to jump in.

  5. Darren says:

    I feel immensely fortunate to have grown up with early computers. By necessity we had to learn how the computer works. Of course, the immense power that people have at their fingertips today means that people can get amazing things done on a computer without any real knowledge of how things work inside it. The problem is that without that basic understanding, the ability to troubleshoot is lost when something doesn’t work as expected.

  6. 𐂀 𐂅 says:

    Computers are just mind amplifiers. Work on maximising the developmental potential of the child’s mind; then, when they are ready, they will be in the best position to decide which details of which technologies best serve their needs. It also pays to have a lot of interesting stuff accessible, and a family culture of open dialogue about a wide range of topics, so that when a topic comes up and the child shows some interest you are there for them with some stuff from your hoard to show them how it works and why they may find it useful. They may find something interesting and explore it further, sometimes deeply, or just for a while before moving on; at times they surprise you by not being interested at all. You have to let the child guide you when it comes to exploring knowledge because it is their curiosity that motivates them and later in life it contributes to a habit of lifelong learning. The key is to build an environment of opportunity and variety for their minds to expand into.

    1. Neil says:

      “You have to let the child guide you when it comes to exploring knowledge because it is their curiosity that motivates them and later in life it contributes to a habit of lifelong learning. The key is to build an environment of opportunity and variety for their minds to expand into.”

      Thank you for this.

    2. John says:

      Excellent comment.

  7. drenehtsral says:

    I struggle with this question with my own kids as well. I too grew up in parallel with home computers. My first computer was 8-bit and I learned to program it in BASIC and assembly language in elementary school. In middle school I upgraded to 16-bit, and in high school to 32-bit.
    This made each step accessible, comprehensible, and intuitive. My children, however, do not seem to take to programming or creating much beyond what can be done in video game level editors, and I find myself wondering how much of that is just a difference in personality and temperament vs. how much comes from being overwhelmed by the depth and complexity of the systems involved. The limitations that made home computers so accessible during my childhood provided both a challenge and a goal that felt within reach: the gap between professionally produced games and software vs. hobbyist games and software did not seem nearly as insurmountable as it does today, when commercial offerings are built by enormous teams of developers and specialists rather than by individuals or small teams.
    I worry that today’s children will not have the empowering experience of realizing “hey, I could do that!”.

    1. Maave says:

      If they do like editing games then just try to get more technical with it. I learned a lot of my Windows knowledge by installing my own PC games, modifying, cheating, etc. Makes you feel like a 1337 haxor for editing an ini file

  8. Duh says:

    I’m so far finding this discussion very disappointing. Y’all seem to be looking at this as though computers are tools, or references, or toys, but no one seems to be considering the power these things have for flat-out mis-informing the user… how are y’all teaching your kids to be skeptical about sites’ legitimacy?
    We learned these things as they progressed, too… bootblock virii are just silly and annoying, then came those that infected your programs and manipulated your files… also annoying, but nothing like the societal impacts of email takeovers that send messages under your name… or “teaching” websites that are flat-out /wrong/… or “cancer pills” from “canada” for prices folk can actually afford, that are just placebos at best. We grew up alongside those progressions, learning to recognize and weed through those questionable resources, but now we have generations, both old and new, being thrown into that deep end. What’re we doing about this? Not even discussing it, judging by the article and comments so far.

  9. Julian Skidmore says:

    In my opinion, in the same way that we don’t stick kids in front of “War and Peace”, expecting them to excel at reading, because it’s 1,000 times more complex than “The Gruffalo”; nor do we stick children in front of Mathematica because it’s so much better than a four-function calculator; it doesn’t make sense to teach programming or understanding computers using modern multi-core, multi-GHz, 64-bit machines.

    You really do need to start at the bottom, on computers that boot in <1s straight into a self-hosted programming language with direct access to some simple graphics and sounds. Everything else is an illusion.

    1. monsonite says:

      Good comment Julian.

      When I started there were barely any graphics, and virtually no sound. This encouraged you to use your imagination and ingenuity.

      Booting to a self hosted language within a second might seem anachronistic these days, but I have wasted years of my life waiting for a wretched PC to boot.

      Now you can buy an ESP32 based board that has VGA, PS/2 mouse and keyboard, SDcard and headphone jack which will emulate a whole range of retro machines for about $20.

    2. Winston says:

      Here’s my first “computer.” The included course materials were fantastic:

      https://www.old-computers.com/museum/photos/heathkit_ET3400_System_s2.jpg

  10. Mark Walter says:

    As a baby-boomer I grew up on the cusp. I remember my first computer: it had to be the early ’60s. It was a mechanical contraption that was activated by pulling/pushing a mechanical slide. There were mechanical bails (wire rods that slid in and out) which caught, in turn, projections/pegs on the mechanical slide. So by putting pegs in the slide you could, for example, make the ‘computer’ count.

    I wish I still had that: what a cool toy. I also wish I had saved the “Mr Machine” I got for Christmas one year. I wonder what that would be worth today?
    Mark

    1. Garth Bock says:

      You are talking about the DigiComp from the 1960s. You can find them on eBay for anything from $250–500. My older brother had one and I got to play with it a little when he wasn’t around.

    2. Garth Bock says:

      I had a Mr Machine as a kid. It got me into trouble because it led me to taking apart other things in the house.

      1. mythoughts62 says:

        I was one of those kids who took everything that sat around too long apart. My parents really started to pay attention when I could put them back together and they worked. Some of them didn’t work when I took them apart. I got my first job as an electronic technician when I was 15.

  11. ian 42 says:

    “it doesn’t make sense to teach programming or understanding computers using modern multi-core, multi-GHz, 64-bit machines.”

    Wrong. It even makes more sense to do it now than ever before.

    My son is 16, and I’ve taught him programming (languages, data, structure, design, debugging) since he was 2. The idea is to get them interested in creating things other people use, not just sitting on a computer using what other people write.

    ie creating instead of just consuming.

    Yes, computers are more powerful, yes you can google and find a program that does 80% of what you want. But there is even more need for people who understand what’s going on and can write good software than there ever has been. And for people who look at https://xkcd.com/2347/ and understand the problem – and can write something that doesn’t depend on 57 libraries…

    So to answer the question in the article: start them at a young age with something visual like Scratch, then move them into programming in a text-based language to do things they want to be able to do… And one day, if they take an interest, they will be able to program just about anything to do just about anything…

  12. Garth Bock says:

    I think some of us were lucky to grow up pre-home-computer and grow into the technology. I had an AWS protoboard. It had a 4-bit processor, a hex keypad, an LED display, and a breadboard to link to circuits you design. Later came a TRS-80 and training at NCR tech school, where the cash registers were programmed in binary and BCD converted to binary.

  13. Bill-R says:

    As a teen in the mid-70s, I liked building electrical devices. There were plenty of parts to scavenge from TVs and other devices. From there I turned to early 8080/Z80 machines, adding to them with a soldering iron. I am still building and using computers, my latest an AMD 3800 and an NVidia GPU running Kubuntu 20.04, with the GPU used in Python. I learned to program on those early machines and still love it.

    I have 2 teenage daughters who are “digital natives”, and yes, touch screens are very intuitive to younger children. Computers come naturally too, then. However, even after starting in my teens, after all these decades I too have gone native.

  14. localroger says:

    Kids today aren’t really growing up with computers. They’re growing up with media devices. The nuts and bolts that we had to learn in the 1970’s and 1980’s are abstracted out. I was shocked to learn that a coworker who had been doing many personal projects with Arduinos for years did not know how 2’s complement binary math worked. This is a thing you learned within a few weeks back in the day if you wanted to make the computer do anything useful at all, but when I explained to him that binary integer math is circular and the difference between a signed and unsigned integer is whether you put the zero at the top or bottom of a clock face, his mind was blown. And don’t even get me started on all the bugs I’ve found in industrial crap because they don’t teach people the difference between floats and reals any more.
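    The clock-face explanation above can be sketched in a few lines of Python. (A hedged illustration: Python integers are arbitrary-precision, so the 8-bit wraparound is simulated with a mask; `to_signed8` is a helper name invented here.)

```python
# The "clock face" view of 8-bit integer math: the same bit pattern reads
# as one number when taken as unsigned and another when taken as
# two's-complement signed.

def to_signed8(value: int) -> int:
    """Interpret the low 8 bits of value as a two's-complement signed byte."""
    value &= 0xFF                      # keep only 8 bits (the clock face)
    return value - 256 if value >= 128 else value

# Arithmetic wraps around the circle regardless of interpretation:
assert (200 + 100) & 0xFF == 44       # unsigned: 300 wraps past 255 to 44
assert to_signed8(0xFF) == -1         # same bits: 255 unsigned, -1 signed
assert to_signed8(127 + 1) == -128    # signed overflow wraps to the minimum
```

    The only difference between signed and unsigned is where the zero sits on the clock face, exactly as the comment describes.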

  15. John Q. Public says:

    How many people actually sit down and write a program anymore? Now it’s all apps and internet-connected media. Every day on every news channel, on both TV and radio, you hear “get our app” or “download our app”… How about just watching the TV or listening to the radio? Now you have to have an app? Computers aren’t computers anymore like when the C64 and Apple //e were around. How many remember sitting many hours on a weekend typing in a machine language program from “Compute!” to play a new game? Nowadays, there’s no effort. When Doom first came out, it was one of the most downloaded pieces of software anyone wanted. Some downloaded it via modem, which took hours. Nowadays with our gigabit networks and lightning-fast internet connections you can download a game in a few minutes. No effort needed. Unless you’re going into writing software as a living, most don’t need to learn to program computers. With AI, they’ll probably be programming themselves in the future.

  16. Nuxi says:

    Fortunately, assuming you have a decent school district, many of your concerns should be figured into your child’s curriculum.

    Libraries, and librarians, have changed significantly from what you may remember from elementary school. As a public school librarian, my wife teaches media literacy (including discerning truth from disinformation), internet safety (including topics such as data privacy and malware), and basic coding with tools such as Code-a-Pillar, Ozobot, and Hour of Code, beginning in kindergarten.

    1. Duh says:

      This sounds promising… except, I heard on NPR just two years ago that some entire /states/ don’t even have computer curriculums in their schools, “yet.” 20friggin19. Here’s hoping your librarians’ new roles and expertise are more commonplace!

  17. Jerry says:

    I’ve supplied computers and such for kids and grandkids all along.
    This video is my grandson, from 2018, and shows what a smart kid can do if given the tools.
    I bought him a lifetime membership in MENSA when he was 9. (Right after he did this video)

  18. bolobot says:

    One thing that I think is pretty important to remember, and took me a while to get, is that your kids are not you and will find their own things that interest them; the only thing you can really do is expose them to as much as possible and see what sticks in their brains.

    To me messing around with the ZX81, Vic20, Commodore 64 in high school opened an environment where I could be a world builder and now 40+ years later that’s what I’ve been using for most of my adult life to pay my bills and feed my hobbies.

    My kids had their domain names before they were born and an old laptop to play around with before they could walk, and even though I taught them binary and basic programming, and built micro-controller based projects with them when they were young, up until now they really aren’t all that interested in doing a deep dive into the nuts and bolts of programming. But, you know, that’s o.k. They’re using Blender, doing video editing, and using the computer as the tool that it has become, in ways that didn’t exist when I was a kid, to build things and their worlds with the things they are interested in. And it’s all good.

  19. Taper Wickel says:

    I cut my teeth on school-district Apple IIs and IBM PS/2s (the 4th-6th grade campus got a lab of Model 50s, the ones all in one box like the early Macs), my grade-school best friend had an Atari 800, and my mom’s parents got Macs by 1986 or so (an aunt was in graphic design and early desktop publishing) — but my first home computer was a TRS-80 Model I, which we got in 1988; it was nine years old at the time. We got an Apple IIgs a year later, but I credit the TRS-80 with giving me the retrocomputing bug very early.

  20. AbsoluteRecoil says:

    Most of the young kids I’ve run into these days are only comfortable within their social media UI. They panic when confronted with a desktop, thinking they broke something, and panic again when, God forbid, a terminal pops up. The future looks bleak for the new generation.
