Fellow crustaceans, 2023 will be the year of ${TECHNOLOGY}

source link: https://lobste.rs/s/ef3rhw/fellow_crustaceans_2023_will_be_year

Following up on my post from two years ago:

What technology will “come into its own” in the coming year? Is 2023 the Year of the Linux Desktop? Will Rust replace all other programming languages? Will Copilot make us all obsolete? Will DALL-E become sentient? Is Fortran ready for a renaissance?

What technology is going to be the one to know in 2023 and beyond, in your learned opinion?

(This question is intentionally open-ended, in the interest of driving discussion.)

  1. AI being used in inappropriate ways

    1. Are there inappropriate ways to use AI? Ineffective, maybe

      1. zk

        10 hours ago

        Figured they meant unsavoury, malicious ways.

      2. gwn

        2 hours ago

        Every tool can be used for inappropriate purposes. I believe this is not controversial. If this is not sarcasm and you are serious, I recommend you reevaluate your position.

      3. kellogh

        10 minutes ago

        A lot of these major models come with model cards that describe appropriate and inappropriate usage, how the model was trained, and so on.
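
        As an illustrative sketch only (the field names and values below are assumptions loosely modelled on typical model cards, not taken from any real card), the kind of information such a card carries might look like this in code:

        ```python
        # Hypothetical example of the metadata a model card documents.
        # Field names and values are illustrative assumptions, not a real card.
        model_card = {
            "model_name": "example-llm",  # hypothetical model
            "intended_use": [
                "drafting and summarising text",
                "code completion suggestions reviewed by a human",
            ],
            "out_of_scope_use": [
                "unreviewed medical, legal, or financial advice",
                "generating content that impersonates real people",
            ],
            "training_data": "publicly available web text (details vary by model)",
            "known_limitations": [
                "can produce fluent but factually wrong output",
                "reflects biases present in the training data",
            ],
        }

        # A consumer of the card can check a planned use against it:
        planned_use = "unreviewed medical, legal, or financial advice"
        if planned_use in model_card["out_of_scope_use"]:
            print("This usage is documented as inappropriate for the model.")
        ```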

  2. threkk

    13 hours ago

    There will be a new JavaScript framework.

    1. Only one? Very optimistic

      1. We can only pray.

  3. I really think Rust will become “the” systems language for new project starts, if not in 2023 then in 2024, unless it suffers some sort of spontaneous catastrophic failure.

    I think there’s going to be a renaissance in on-premises computing, and while The Cloud will continue to grow, there will be more people running their own stuff and/or smaller datacenters will come back into fashion, at least a bit.

    I think Android is going to fall below 40% market share in the United States and maybe elsewhere.

  4. l0b0

    edited 12 hours ago

    2020 was the year of the Linux desktop, when most Steam games could be played by simply enabling Proton.

    2021 was the year of Nix, bringing the first “easier to read than to write” configuration language to the masses (well, at least to the millions maintaining *nix machines).

    2022 was the year of Linux gaming, when the Steam Deck was released, solidifying Linux’s position as a good gaming OS.

    2023 is hopefully the year of HDR (10+-bit colours) on Linux. And the start of the next AI winter, as people realise that a statistical language model is neither necessary nor sufficient for “intelligence” (whatever that is).

    1. ahelwer

      edited 10 hours ago

      I think you’re right that the era of the Linux desktop is unironically here. A lot of devs are gamers and so have Windows on their home machine. Before, you had to deal with the mess of dual-booting (which basically requires you to learn how UEFI works) so you could boot back and forth between Linux and Windows for projects and gaming. So a lot of people just didn’t deal with it, and kept using Windows. Now Windows is growing steadily more abusive of its users in ways that even devs can’t hide from, and Linux can play nearly all of their games. The list of reasons to stay on Windows, even for dual booting, is getting pretty short. Similar logic probably applies to non-devs. I’m not exactly a trend-setter, and I finally switched over from Windows to Linux this year.

      I don’t think you’re right about AI winter, though. ChatGPT made it clear that the style of internet search that has reigned since like 1995 is on its way out (along with everyone complaining about how search results have gone to crap). It won’t be long before much of people’s non-social-media internet access is mediated by a language model like ChatGPT.

      1. l0b0

        9 hours ago

        I don’t think you’re right about AI winter, though. ChatGPT made it clear that the style of internet search that has reigned since like 1995 is on its way out (along with everyone complaining about how search results have gone to crap). It won’t be long before much of people’s non-social-media internet access is mediated by a language model like ChatGPT.

        Oh, I completely agree ChatGPT will be incredibly annoying, comparable to the email spam problem but for the entire web. It’s just that we’ll get an AI winter once people realise it’s not very good at basically anything useful, only at generating statistically plausible text.

        1. It’s just that we’ll get an AI winter once people realise it’s not very good at basically anything useful, only at generating statistically plausible text.

          Thing is, generating statistically plausible text can be incredibly useful. Not just for creating more spam, but also for actual productivity. For search I would say it is almost completely useless, as you have to go out and verify every little detail it gives you to see whether it is actually correct or not. It can be helpful for getting ideas on what to search for, though.

          Where it shines, for me, is its ability to generate and mangle text. Over the last few days I have used ChatGPT to write technical documentation and review comments. If you look at the end result, about 80% of the actual content was produced by me, but having it turn that content into nice, readable text still saved me significant time.

          People are still figuring out how to use it. Even if there is an AI winter coming for the research part, there will be a lot of activity to transform the current AI into usable tools in 2023.
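
          For the draft-polishing workflow described above, a minimal sketch might look like the following. It assumes the OpenAI Python client (`openai` >= 1.0) and an `OPENAI_API_KEY` in the environment; the model name, prompt wording, and draft text are illustrative, not a prescription:

          ```python
          # Hedged sketch: have a chat model rewrite rough notes as documentation.
          # Model name, prompts, and the draft are illustrative assumptions.
          from openai import OpenAI

          client = OpenAI()  # reads OPENAI_API_KEY from the environment

          draft = """
          install the tool, run it with the config flag, check the output,
          if it errors look at the logs (todo: tidy this up)
          """

          response = client.chat.completions.create(
              model="gpt-4o-mini",  # placeholder; any chat-capable model works
              messages=[
                  {
                      "role": "system",
                      "content": "Rewrite the user's rough notes as clear, concise "
                                 "technical documentation. Do not invent facts that "
                                 "are not in the notes.",
                  },
                  {"role": "user", "content": draft},
              ],
          )

          print(response.choices[0].message.content)
          ```

          Keeping the system prompt restrictive reflects the point made above: the model does the mangling and formatting, while the factual content still comes from the author.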

  5. WebAssembly in more places.

    AI fears.

    I hope zig starts beating rust as a systems language this year.

    1. I don’t think rust and zig really compete, do they? Zig is a “by hand” memory management language

      1. I don’t understand under what circumstances I would choose zig if I had already learned Rust, so I see them as competitors.

        1. x64k

          2 hours ago

          This is probably adding to the competition bit: I know Rust and I am looking at Zig. I doubt my ability to write good, fast code in a language as huge as Rust. I also feel that “knowing” Rust isn’t something you do passively; it’s basically a part-time job. It’s not one that I find particularly rewarding, as language design is neither a hobby of mine nor something I’m professionally interested in, and it takes up a lot of time that I would much rather spend writing useful programs.

    2. Nah, Zig needs to at least do 1.0 (well, as an alternative, Rust can do 2.0) to start to dream about outcompeting Rust :P

    3. +1 for Zig!

  6. rcoder

    11 hours ago

    We’re gonna have to figure out how to make software that actually runs locally again. The public cloud is too expensive, complicated, and hard to operate safely for most projects and teams, and desktop and mobile apps are increasingly unable to function without broadband. (Don’t even get me started on the entirely-standard, deeply-integrated “analytics” hooks mining every shred of valuable data they can on each and every person to touch most apps/services/sites.)

    For me, that means working on smaller things that work independently, and finding ways to repurpose and scale down tools that are normally used to go the other direction. (Tracing tools and load-balancing proxies, for example, are both equally useful for squeezing down onto smaller and smaller hosts as they are for scaling out to giant clusters.)

    I’m also thinking a lot about how to make self-hosting more accessible and realistic for folks who don’t already build software all day. Nextcloud, Yunohost, and other distributions of OSS services are a good start, but we really need simpler, better-supported tools for messaging, data management, and publishing that can actually run on the computers people own and have access to.

  7. That comment predicting the popularization of federated social media was right, though I don’t think anyone could have predicted how.

    1. Speaking of, if you’re the same @briankung, I’m the same @lorddimwit. :)

  8. 2023 is the year of clever people manipulating stable diffusion in clever ways (such as the audio variant recently shared). Everything else is pretty much secondary.

  9. I wouldn’t pin it to 2023 in particular, but:

    • deno – from the technological point of view, it delivers a scripting environment which is hermetic by default, which is huge. A glue “language” where you don’t need to spend hours making the glue itself work is something we sorely miss.
    • https://djot.net – it doesn’t fully exist yet, but it is shaping up to be (hat tip @Sophistifunk) the well-thought-out, well-supported, properly specified solution we’ve all been waiting for.
  10. High-Performance Computing will make a rapid and completely inexplicable comeback.

    1. A good friend of mine is a professor of HPC and I think you’re right.

  11. cor

    2 hours ago

    Nix getting adopted by more devs

  12. Maybe “the year of Matter”? The new local-first open standard for home automation & IoT landing at the end of 2022 might mean a bunch of open software & hardware pushing what’s possible, while being privacy preserving and still having the ability to interop, if desired, with the legacy HomeKit / Alexa / Nest ecosystems.

  13. KiCAD has taken long strides this year, and I think it will finally rise to prominence in professional use during 2023. I started using KiCAD about 5 years ago for hobby projects, and the latest major update (6.0) seriously improved the usability and aesthetics of the interface. When KiCAD has features truly on par with the likes of Altium and EAGLE, the paid software that costs thousands of dollars per license per year just won’t be able to compete.

  14. abstract777

    edited 5 hours ago

    A technology ‘coming into its own’: do we mean something that starts climbing or dominating the charts of the StackOverflow annual surveys, where the average corporate developer (Java coders and the like) or the intended users feel pressured or motivated to try it? That’s more bottom-up. Or do we mean business managers actually pushing the tech top-down in their organizations? Also, is it end-user facing, or infrastructure (less fanfare and less prone to investor mania)?

    Nix (bottom-up) and things like ChatGPT and Copilot fit the criteria one way or the other, except that the funding environment is risk-off and infrastructure stuff generally doesn’t get as much press. So Copilot- and ChatGPT-type things if I had to pick, but the big push by capital allocators - just funding anything that breathes and moves related to the aforementioned - is not going to happen until the financial markets begin to roar again.

    We’re at, I believe, the very beginning of a Nix super-cycle where all sorts of technology will be implemented on top of it, along with more user-friendly, UI-oriented approaches to managing and using it. I think it will be bigger than the ChatGPT stuff in a way, but more behind-the-scenes, where it becomes ubiquitous but generally unknown to the public. Infrastructure. I think Nix has hit critical mass and will be rolled out, in some form or fashion, quietly and strategically by larger and larger organizations globally, with a feedback loop as more Nix dev-friendly services and tools are built.

  15. I have a feeling that in 2023 the decentralised/federated approach will come into its own, now that Mastodon is becoming bigger due to Twitter’s decline in popularity and Matrix/Element are starting to shape up and may become more suitable for casual users. This may result in bigger mindshare for federated systems in general, which leads to new systems being built in this style.

    1. I think if it keeps growing into a competitor that the big companies in that area actually have to worry about, then there will be some form of Embrace, Extend, Extinguish.

      Would be cool though if it was otherwise this time.

  16. eBPF and io_uring. By now they are very powerful technologies, but quite complex to master and a bit immature from a tooling and ecosystem perspective.
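
    To make the “complex to master” point concrete, here is a minimal eBPF sketch using the BCC Python bindings (this assumes the `bcc` package is installed and root privileges; the traced syscall and message are illustrative). io_uring would need its own, longer example and is not shown:

    ```python
    # Minimal BCC example: print a trace line whenever execve() is called.
    # Requires the bcc package and root; the chosen syscall is illustrative.
    from bcc import BPF

    prog = r"""
    int on_execve(void *ctx) {
        bpf_trace_printk("execve called\n");
        return 0;
    }
    """

    b = BPF(text=prog)  # compiles the restricted-C program and loads it into the kernel
    b.attach_kprobe(event=b.get_syscall_fnname("execve"), fn_name="on_execve")

    print("Tracing execve()... hit Ctrl-C to stop")
    b.trace_print()  # streams lines from the kernel trace pipe
    ```

    Even a hello-world like this already mixes restricted C, kernel probe points, and a user-space loader, which is roughly what “complex to master” means in practice.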

    And zig :)

  17. WASM and WASI.

    I have hopes for WASI. Standardized interfaces (be they actual standards or de facto standards by virtue of everyone using them) tend to result in a lot of innovation, because good (and bad) ideas have a lower barrier to being applied in practice/production.

    But maybe it will take a bit longer to reach widespread use. However, I do think it’s reasonable to assume that important cornerstones of the ecosystem will be created and/or find adoption in 2023.

    Encodings. I think we’ll see a bit of a shift in the widespread adoption of various encodings, as there have been quite a few new ones and winners and losers are being determined “by the market”, so to speak.

    In terms of a lot of things labeled AI, I think we’ll see a lot of disillusionment and interesting ways they get fooled, hacked, and tricked. And situations where the statistical nature shines through more and more.

    I’m really curious about what laws regarding AI and copyright infringement will bring, because they might (or might not) affect copyright law at large.

    While not a technology itself, I also expect a rise in vendors trying to lock in customers again, especially by means of SaaS. I think both the economic situation and how the industry works at the moment are leading towards that, especially with companies and people trying to reduce costs. Maybe the Fediverse example also scares some companies and reminds them that people could leave.

    I actually believe more in disillusionment and crashes. From AI to Web 4.0, programming languages, databases, maybe parts of cloud computing, smart homes (and smart devices at large), JavaScript frameworks older than three months.

    Not saying that all of these things are necessarily bad, but they are hopelessly over-hyped and over-marketed. That’s true for a lot of technologies, and even when they are good, that leads to them being used for things they weren’t built and designed for. Someone sees an opportunity to bend them to kinda fit another use case, and then the problems begin.

    And of course like fashion trends there will be a lot of “next big things” that we’ll have forgotten about in two years.

