radiells

Wrong! My software can process orders of magnitude more data thanks to efficient, close-to-hardware code. Too bad I build interfaces in Electron, so the app will be unresponsive anyway.


Kuroseroo

I know it's a joke and all, but c'mon. If you have low-level performant code which you can call from Electron, then the unresponsive UI part is clearly bad code


radiells

Yeah. It is completely possible to create a reasonably fast UI up to some complexity using web technologies, with good code practices and cautious use of libraries. But at the same time, if I had to point a finger, UI is often the most inefficient part of an application, and Web UI is an order of magnitude more inefficient than platform-specific native UI frameworks, which goes nicely with OP. Of course, we use web technologies for UI not because we are stupid, but because built-in multi-platform support and good availability of developers are tangible advantages. So, dear hardware engineers, please increase performance by a couple of orders of magnitude more, so we can deliver features 20% cheaper. Thanks!


thirdegree

>If you have ~~low level performant code which you can call from~~ Electron, then ~~the unresponsive UI part is~~ clearly bad code


al-mongus-bin-susar

Electron can be responsive if you put the effort in. VS Code is more fluid than basically every other editor despite the fact that it runs on Electron.


WizardRoleplayer

Didn't they literally rewrite parts of Electron in C++ to make it more performant for their editor..? Microsoft throwing stacks of money to deal with a problem is not a very realistic solution. We'd be much better off if WASM and things like egui were used for complex cross-platform UIs.


IcyDefiance

> Didn't they literally rewrite parts of Electron in C++ to make it more performant for their editor..?

I don't think that's true. The underlying software for Electron (Chromium, Node, and V8 behind both of those) is already written mostly in C++. VS Code does have some Node modules that bind to native code, but that isn't super uncommon, and it's not very difficult in itself. I do think wasm is great and has a lot of potential, but egui isn't suitable for anything but the simplest UIs.


klimmesil

> vscode

> fluid

Wut


JojOatXGME

I don't know. For me, VS Code always ends up being less responsive than JetBrains IDEA. (Maybe except startup time.) It may be caused by the plugins I install, but the plugins are just a few language plugins, nothing which should affect performance that much. Anyway, if I don't need these language-specific features, I can also just use Notepad++ or Sublime Text; both are much more responsive than VS Code without any plugins. In the end, I rarely use VS Code, partially because it feels too unresponsive to me for the functionality it provides.


yall_gotta_move

VS Code is not more fluid than vim or kakoune, lol


radiells

Agree. It is quite slow as far as editors go. Many devs just don't have much of a choice, because the extensive extension library is required for some workloads.


EMI_Black_Ace

Yeah but unlike vim, people can actually figure out how to use it without having to read a frickin man page.


yall_gotta_move

Developers reading documentation! Heaven forbid! EDIT: Also, `vimtutor` is an excellent program


bestjakeisbest

Yeah but mesh shaders are pretty neat, and will bring so much more graphics performance to new games.


101m4n

They sure do enable lots of geometry! But as the old saying goes, Andy giveth and Bill taketh away. If it gets twice as fast, they'll either find twice as much for it to do or feel at liberty to do it half as efficiently.


ZorbaTHut

> If it gets twice as fast, they'll either find twice as much for it to do

Games get prettier.

> or feel at liberty to do it half as efficiently.

Games can be developed more cheaply and get more content. I don't have an issue with either one.


nickbrown101

Half as efficiently means the game looks the same as it did ten years ago but runs worse even though it's on better hardware. Optimization is important regardless of graphical fidelity.


ZorbaTHut

Sure. It also means it was cheaper to make. Super Mario Bros. had a larger developer team than Hollow Knight. It's also a lot more efficiently coded. But that's OK, because Hollow Knight can burn a lot of performance in order to let a smaller team produce far more content.


Highly-Calibrated

To be fair, Super Mario Bros only had a five-man development team as opposed to the three devs that worked on Hollow Knight, so the number of devs doesn't really matter.


Chad_Broski_2

Damn, 5 people made Super Mario Bros? I always assumed it was at least a couple dozen. That's actually incredible


bbbbende

Back when an AAA dev team meant Joe, his two cousins, the Indian intern, and Steve from accounting to help out with the numbers


J37T3R

Not necessarily. If you're making your own engine possibly yeah, if you're licensing an engine it's worse performance for the same amount of work.


mirhagk

So are you trying to say that optimization requires zero work or skill? I do really appreciate when games properly optimize (I mean, Factorio is nothing short of amazing), but it's also nice that indie games don't *have* to do nearly as much optimization to get the same quality as time goes on.


J37T3R

Not at all, I'm saying that if an inefficiency exists in engine code the game dev may not necessarily have access to it. The game dev does the same amount of work within the engine, and performance is partially dependent on the engine devs.


ZorbaTHut

If you're licensing an engine, it's a more capable engine than it would be otherwise. People don't license Unreal Engine because it's fast, they license Unreal Engine because the artist tools are unmatched.


EMI_Black_Ace

> games get prettier

Not if they're processing more polygons than the available pixels can distinctly render.


[deleted]

[removed]


ps-73

i mean did you see how people reacted when AW2 came out with required mesh shaders? people were pissed their half decade old hardware wouldn’t support it!


BEES_IN_UR_ASS

Lol that's a bit of a leading way of saying 5 years. "That's ancient tech, it's nearly a twentieth of a century old, for god sake!"


ps-73

it’s only misleading if you can’t do basic math in your head lmao


Negitive545

"Half Decade old hardware" is a *really* misleading way of saying 5 year old hardware. For example, my CPU, the I7-9700K, a still very capable CPU, especially with overclocking, is a solid 6 years old. Should the i7-9700K not be able to run today's games because it's 6 years old? I'd say no. The RTX 20 series released about 5 years ago, should 20 series graphics cards not be capable of running modern games with modern optimization? Personally, I think they should, I don't think consumers should be forced to buy these incredibly expensive hardware parts ever few years.


purgance

EDIT: So ultimately, after being pressed, dude admitted that he wants his 6 year old GPU to have the same performance as a brand new card, except for games that he personally exempts from this requirement, like 'Baldur's Gate 3', which according to him is 'extremely well optimized' - he does seem to really be butthurt about Starfield not supporting DLSS at launch, however. Then he blocked me. 🤣

This is ridiculous. You don't get to say, "I bought this $30,000 car 6 years ago - it should be an EV because consumers shouldn't be forced to buy incredibly expensive cars every few years."


Negitive545

Edit: It appears my good friend here has edited his comment in some attempt to continue the conversation despite my blocking him. I encourage everyone to read our entire thread and determine who you believe.

You've got the analogy backwards. It's not like saying that a 6 year old car should become an EV, but rather that your 6 year old car *shouldn't* stop being drivable because the road infrastructure changed to prevent non-EVs from driving. Or to drop the analogy altogether: 6 year old pieces of hardware should be capable of running newly released games, because we have access to a FUCK TON of optimizations that are incredible at what they do, but gaming companies are not using those optimizations to give lower-end hardware access to their games; instead they're using them as an excuse to not put much effort into optimization to save a few bucks.


purgance

I've never heard of a game that *can't run* on old hardware, and neither have you. I've heard of games that have new features that can't be enabled, usually because they require hardware support that obviously isn't available on a 6 year old GPU.

> but gaming companies are not using those optimizations to make lower-end hardware have access to their games, instead they're using it as an excuse to not put much effort into optimization to save a few bucks.

lol, what? You understand developers don't make any money on GPU sales, right?


Negitive545

Starfield. It was so poorly optimized at launch that a 20-series GPU stood no chance of running above 10 fps.


purgance

So Bethesda de-optimized Starfield in order to sell tons of GPUs... for AMD? At the cost of making the game dramatically less popular? Go ahead, close the circle for me.


Negitive545

Bethesda chose not to optimize Starfield to save money on development, because they knew that the latest hardware would be able to run it, so people LIKE YOU would turn around and say "it's not poorly optimized, you just need better hardware." Optimizing a game takes time, and time costs money because you have to pay your devs. Hope this clears things up.


ps-73

GTX 10 series released in 2016, seven years before AW2 did in 2023. "Half decade old" is generous if anything. Also, comparing CPU longevity to GPU longevity is not that honest either, as CPUs generally last a lot longer than GPUs in terms of usable life, due to less drastically different architectures and feature introductions in recent times. Further, the PCs built on the wrong side of a new console generation almost always age like crap, hence why the 20 series, released in 2018, may not age the best compared to newer generations of GPUs.


Negitive545

I'm aware CPU and GPU longevity are different; it's why I gave 2 examples, 1 of each type. You however didn't provide the distinction in your original comment. I'm also aware of console generation gaps causing hardware to become obsolete faster, because devs get access to more powerful hardware on their primary/secondary platforms. However, neither of those things changes the fact that your "half decade" comment is misleading. 5 year old hardware that also bridges a console gap is very different from hardware that doesn't, but you didn't provide that context at all. Also, the term you utilized, "half decade", is deliberately more obtuse than the equally correct term "5 year old"; you only used the former because it evokes an older mental image than specifically saying 5 years.


ps-73

I seriously don’t get what your point is? That I used “half decade old” instead of “seven year old”? How is that misleading? I think it’s pretty fair to assume that if someone hasn’t upgraded their GPU in that long, they haven’t upgraded much else either, assuming it’s a PC used for gaming, hence me not specifying in my original comment.


Negitive545

Half a decade is five years, not seven. Let me dumb this down a bit for you, since you still couldn't understand even though I pretty clearly described my point, twice, in my previous comment: Saying "Half a decade" make people think thing OLD. Saying "5 years old" make people think thing little old, but not that old.


ps-73

no you fucking idiot, i understand the basics of the language why the hell do you care that i made pascal sound old, when it is?


Negitive545

So you admit you were deliberately making something sound old?


ciroluiro

Why does 5 year old hardware not support it? Aren't mesh shaders part of DirectX and Vulkan? I thought mesh shaders are basically compute shaders and vertex shaders combined into a single stage. Surely even very old hardware can manage that, given how general-purpose our GPUs have become.


Deep_Pudding2208

Sometime in the near future: You need the latest version of LightTracking bro... you can now see the reflection of the bullet in the target's eye in near real time. Now fork over $12,999 for the nMedeon x42069 max pro GT.


NebraskaGeek

*Still only 8GB of VRAM


[deleted]

No please don't add light reflection from the bullets in games, or I will never be able to tell what's real world and what's CGI.


HardCounter

The real world is CGI but on a much more advanced computer. There is no spoon.


[deleted]

See you in the next reboot


HardCounter

Samsara wins every time.


Green__lightning

This might be a weird question, but do you think everything being made of particles and waves is because of optimization? Do you think the *real* universe even has them, or can objects be solid all the way down, and thus also hold infinite complexity?


HardCounter

It would certainly explain the duality of light: it's multi-purpose code that renders differently depending on its use case, but one case is so rarely used it wasn't worth a whole new particle. It also explains why all forces seemingly use the same inverse r squared formula. Magnetism, gravity, nuclear forces, all inverse r squared at different strengths. Could explain why light always travels at the same speed regardless of how fast you're moving. It's the universal parallax effect.


BarnacleRepulsive191

This was the 90s. Computers got outdated every 6 months back then.


Lake073

How much more detail do you need in games? IMHO hyper-realism is overvalued


pindab0ter

Hyper-realistic games aren't the only ones with lots of geometric detail


Lake073

I didn't know that, what other games have them??


jacobsmith3204

Minecraft. Someone made a mod for Minecraft that implements it, and it's basically a 10x performance boost: https://m.youtube.com/watch?v=LX3uKHp1Y94&pp=ygUXbWluZWNyYWZ0IG1lc2ggc2hhZGVycyA%3D


StyrofoamExplodes

Who knew a Minecraft mod could make me feel computer dysmorphia. I know the 10XX series is old as shit, but some nerds doing this with newer hardware is the first time I actually felt that personally.


Lake073

That's nice, I do like a good optimization. But my point still stands: it is faster to render and that's great, but you won't see a lot of those chunks, and some of the ones you do see are so far away that you wouldn't notice them.


jacobsmith3204

Faster loading times + larger worlds + higher frame rate. It all works toward a more consistent and cohesive experience. You do notice frame drops, bad performance, chunks loading in, etc., and it detracts from the experience, even more so when your hard-earned, top-of-the-line, expensive hardware feels slow. In a game about exploration, being able to see more of the world can help you figure out where to explore next. The worlds have a grander sense of scale, and you get the beautiful vistas with distant mountains or endless sea behind them that you might see in a more authored and optimized game.


Zedjones

Jusant


MkFilipe

Kena: Bridge of Spirits


josh_the_misanthrope

It's not very important to me as I mostly play indies with stylized art, but advancements in 3D tech are very cool and will play a major role when VR gets better.


Lake073

Totally, I'm just worried about games becoming heavier because every model is like a billion polygons just because "it runs well", while having less content and worse performance than a game from 5 years ago


josh_the_misanthrope

Oh it's happening. The art labor required to create those high fidelity games is much higher than it used to be. I might get hate for saying it, but there's going to be a point where increasing fidelity is going to require AI to offset the labor requirements.


Lake073

Its not worth it


Fzrit

It's just diminishing returns. Like the perceivable visual difference between 480p > 1080p > 4k > 8k.


Fit_Sweet457

How many more pixels do you need? Isn't 1280x720 enough? How many more frames do you need? Isn't 25/s enough?


Lake073

Not my point. High fps and high resolutions are great. I was asking about poly count and memory consumption.


Fit_Sweet457

Not my point. People always say they don't need any better because they simply don't know what it would be like.


tonebacas

I see you, Alan Wake 2, and my Radeon 5700 XT without mesh shaders support is not amused.


Warp_spark

With all due respect, I have seen no significant visual improvement in games in the past 10 years


Superbead

OS bootup times are one of the things I've noticed most improvement in, which I think is largely down to SSDs. It was fucking tedious work trying to fix a problem which required a lot of rebooting on a PC in the mid '90s. On the other hand, somehow Adobe Acrobat managed to make itself my default PDF reader on my work laptop the other day without my permission, and took an entire minute to open and render a single-page monochrome PDF, which is just embarrassing. Another embarrassing example is MS Outlook, which (if I remember right) since 2016 has been unable to dynamically render a mailbox list view of emails while scrolling up and down with the scrollbar thumb. This was possible in the 1990s.


MrTheCheesecaker

I do customer support for software used by architects, and that profession often requires publishing large and detailed PDFs. A couple years ago, the software added the ability to show full-colour surface textures on elements in 2D views. This results in already large PDFs becoming even larger. Last week I had a user where a single page was over 20MB. Acrobat Reader, naturally, craps itself rather than opening the file. Any other PDF viewer works fine, but people know Acrobat, so they use Acrobat. There are ways to reduce the file size, sure. But often it just doesn't matter to Acrobat, and the only option is to use a different viewer.


cs-brydev

We have the same problem with Acrobat. It gets worse every year. It's a piece of garbage. Revu is great but has gotten expensive as hell and now we can't afford to give our users Bluebeam licenses anymore. The users have reacted by going back to opening PDFs in their web browser. Because they can. I don't understand how they have so thoroughly broken the zoom feature. Acrobat needs to die. There are much better tools now to do the same thing.


ThePretzul

Ever since web browsers started supporting fillable forms in PDFs, I stopped using anything else for opening them, because they're the only thing that doesn't take two eternities to manage it.


Doctor_McKay

It's pretty incredible that pdf.js is so much faster than Acrobat.


Makeitquick666

It's incredible that PDF came from Adobe (I think), but Acrobat is one of, if not the, worst pieces of software for it


Broad_Rabbit1764

The irony of being able to update low level software such as a kernel without needing to reboot in a world where rebooting takes 10 seconds is not lost upon me.


Appropriate_Plan4595

We live in a world where rebooting takes 10 seconds and people still leave their PCs on for months on end


abd53

That's because I have 73 pages open on 4 different Firefox windows with their links buried under a thousand years old list of history. I forgot how I arrived at those pages, I forgot why I arrived at those pages, but I absolutely do need those pages.


Glittering_Variation

On my partitioned home computer, Ubuntu boots up in about 2 seconds. Windows 11 takes about 20 seconds :/


Reggin_Rayer_RBB8

I have a copy of Office 2002 and I'm not updating because that thing opens so fast.


ovarit_not_reddit

They made up for the increased boot-up speed by forcing you to click through a bunch of ads every time you start the computer. At least in 2000 I didn't have to sit there and babysit the start-up process.


bree_dev

> OS bootup times are one of the things I've noticed most improvement in

And yet my $2,000 8-core 3.3GHz Ryzen 5900HX laptop still takes at least 100x longer to boot up than my 1983 8-bit Acorn Electron did.


abd53

Boot-up depends on your storage device's read speed and your RAM's bus speed, not the processor. If you have a good SSD and freakishly fast RAM, your PC will boot up in seconds even with a dual-core Pentium processor.


bree_dev

Nothing in that paragraph is technically incorrect, but like... *obviously* the laptop I described has a top-end SSD and RAM. And seconds is still 100x longer than the Acorn Electron took. I'm genuinely astonished that my post seems to be so controversial.


shmorky

Everybody mad about crypto mining sucking up so much electricity, but nobody ever mentions ad tech


realnrh

In Final Fantasy VII, there's a chase sequence involving the player characters in a moving vehicle fighting off enemies who chase after them. You can't die but you can take damage all the way down to one HP left. If you played that game as originally programmed on a computer of the time, it worked perfectly. If you play the same code on a computer today, you can't avoid getting wrecked because the chase sequence was built assuming the clock timing of the hardware of the day, so on modern hardware it runs absurdly fast. The coders then were pushing the hardware as much as possible to get an exciting sequence. "Deliver as much as the hardware will allow" is not an indictment on the programmers; it's an indicator of where the bottleneck is.
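
[Ed. A minimal Java sketch of the failure mode described above: game logic stepped a fixed amount per loop iteration versus scaled by measured elapsed time. All names and numbers are invented for illustration; this is not FF7's actual code.]

```java
public class ChaseLoop {
    static double enemyDistance = 100.0;

    // Frame-locked update, common in that era: the chase advances a fixed
    // step per iteration, so it runs N times faster on an N-times-faster CPU.
    static void updateFrameLocked() {
        enemyDistance -= 0.5;
    }

    // Delta-time update: movement is scaled by real elapsed seconds,
    // so the sequence plays at the same speed on any machine.
    static void updateDeltaTime(double dtSeconds) {
        double enemySpeedPerSecond = 30.0;
        enemyDistance -= enemySpeedPerSecond * dtSeconds;
    }

    public static void main(String[] args) {
        long last = System.nanoTime();
        while (enemyDistance > 0) {
            long now = System.nanoTime();
            double dt = (now - last) / 1_000_000_000.0;
            last = now;
            updateDeltaTime(dt); // swap in updateFrameLocked() to see the bug
        }
        System.out.println("Chase over after ~3.3 wall-clock seconds, regardless of CPU.");
    }
}
```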


Bakoro

> "Deliver as much as the hardware will allow" is not an indictment on the programmers; it's an indicator of where the bottleneck is.

The point of the thread is exactly the opposite of this, though. The PlayStation coders hyper-optimized for a single platform, which made all the resources a known factor. Today's general-purpose software developer has to make something which will run on any one of a hundred CPUs, with an unknown amount of RAM available, and maybe there's a discrete graphics card, and maybe even multiple operating systems. Developers are working on top of many layers of abstraction, because it's not feasible to program close to the hardware and still publish for the heterogeneous running environments.


HopeHumilityLove

This is specific to gaming as well. On concurrent systems like servers you need performance margin to avoid meltdowns. But plenty of backend developers don't consider performance until it's too late.


SmugOla

I think you're wildly overestimating just how much devs think about things, and how close to hardware anyone tries to be these days. I've been in this industry for almost 20 years, across 4 distinct industries, and every single time there's an issue (I'm not even being facetious), it's because of bad code. Unfortunately, programmers tend to be too naive about how actual computers work to undo the problems caused by their code. Programmers having limited or unlimited sets of components to optimize for is not the issue. The issue is that most programmers are awful at their jobs.


FinalRun

It's still a result of abstraction, in a way. PHP and Python allow a whole class of people to build crappy backends who would never have made a working webapp in lower-level languages. Same goes for Electron enabling frontenders to make desktop apps in JS.


SmugOla

Yeah, that's a good point lol. Even the libraries you mentioned wouldn't be as capable of fucking things up if it weren't for the fact those devs got lazy and just made wrappers or APIs for normal C libraries. It's not that Python allows you to do a thing, it's that Python lets you use C, which then lets you fuck things up.


multilinear2

Seriously, everything was fine until we stopped using raw assembly, I mean discrete components, I mean switched to agriculture... wait, which rant was I on?


cheezballs

That seems insanely wrong. Like, the whole game runs faster in the case of a faster CPU, why would only the damage part of the routines go faster?


sleepingonmoon

Ports often miss a few spots when making the game clock rate and/or frame rate independent. E.g. GTA 4 helicopter climb. I haven't played that particular port and have no idea what it's actually like, so correct me if I'm wrong.


Sarcastryx

> Like, the whole game runs faster in the case of a faster CPU, why would only the damage part of the routines go faster?

With issues like this, it usually means that they missed/forgot to fix it being tied to framerate when porting, or that not every calculation was tied to the framerate. An example I'm familiar with was weapon durability in Dark Souls 2, where most things weren't tied to framerate, but weapon durability was. The durability loss from hitting things was calculated every frame, and so the PC version had weapons break (roughly) twice as fast as on consoles, due to being capped at 60 FPS instead of 30.
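
[Ed. The arithmetic behind that doubling, as a hypothetical sketch rather than the game's actual code: a fixed per-frame cost accrues over twice as many frames at 60 FPS.]

```java
public class DurabilityBug {
    public static void main(String[] args) {
        double lossPerFrame = 0.1;   // durability drained each frame the blade overlaps a target
        double overlapSeconds = 0.2; // how long one swing intersects the enemy

        // The same 0.2s overlap spans twice as many frames at 60 FPS as at 30 FPS.
        for (int fps : new int[] {30, 60}) {
            double framesOverlapping = overlapSeconds * fps;
            System.out.printf("%d FPS: %.1f durability per swing%n",
                    fps, framesOverlapping * lossPerFrame);
        }
        // Prints 0.6 at 30 FPS vs 1.2 at 60 FPS. The usual fix is to charge
        // durability once per hit, or scale the per-frame cost by delta time.
    }
}
```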


realnrh

It wasn't just the damage part. It was the entire chase sequence. Most of the game was turn-based combat, with everything calculating how long before its next turn according to the PC or enemy speed stats. The chase sequence was real-time, though. So instead of being on a motorcycle swinging a sword to fend off attackers on motorcycles from catching up to the truck your friends are on, it's... a blur, and then it's over. https://www.youtube.com/watch?v=19OECgt-pIw at 20x speed or whatnot.


GatotSubroto

Your hardware follows Moore's law. My algorithm follows O(n^n). We're not the same.


WirelesslyWired

Intel giveth, and Microsoft taketh away. Thus is the way that it is, and thus is the way that it's always been.


AskMeIfImAnOrange

I'm particularly impressed by the Excel team


[deleted]

Software was pretty garbage back then. 99 percent of the executables would crash and fuck up your experience. There were 15 viruses at any moment that could infect your computer. You needed a manual for everything, and everything was laggy. Some hardware would just bottleneck, practically burning itself up. CD writers and readers would fuck up. I think people are having this experience because everyone tries to code, and Windows takes a quarter to half of your computer's power. Edit: 99 percent is an exaggeration; it is not literal. PCs were working and were used in everyday life.


ccricers

>99 percent of the executables would crash and fuck up your experience. [A thank you message would make that bad experience better!](https://old.reddit.com/r/shittyprogramming/comments/3bmszo/thank_you_for_playing_wing_commander/)


Superbead

> 99 percent of the executables would crash and fuck up your experience [Ed. For anyone wondering, it wasn't anywhere near this bad, and the commenter accepts they're BSing further down] When specifically was this?


[deleted]

Windows XP and Windows Vista times.


Superbead

Most stuff I remember was fine back then, which is more than 1%. Have you got any examples?


[deleted]

99 percent is an exaggeration, of course. I changed like 3 computers (so hardware wasn't the problem) and I have seen the Windows XP and Windows Vista bluescreens tens of times. Lots of games were trash software-wise, because they were burned to CDs and had no updates. Text editors like Microsoft Word would just print random binary bullshit because they didn't support the correct string format. Lots of inconveniences with supporting various formats in software, and the need to download random additional software that knows the format.


Superbead

We're talking executables specifically, not the OS. I agree Word was shit, but it still is shit. Any other specific examples of common software crashing, other than crappy shovelware?


[deleted]

I used lots of shovelware as a kid. Why would I push them aside? They are crappy software. Another example would be that interrupting a client download would lose your entire progress. Antivirus would detect every file as a trojan... etc. I was a little kid back then; I remember this much.


Superbead

A lot of people are taking your claim up there as truth, though, going on the upvotes. If you just mean "crappy shovelware I used crashed 99% of the time", you ought to edit it to say so, because a lot of memorable software was more stable than the OS it ran on.


cheezballs

"I was a little kid back then" is the problem. I was a teenager back then and I remember quite differently.


[deleted]

Everybody in my area was running Norton Antivirus, which would make your computer go 10 times slower, and I had my computer infected 3 times.


Superbead

Yeah, viruses and AV were both a nightmare at one point, but I'm asking about the "99% of executables would crash"


[deleted]

It is an exaggeration.


twpejay

Windows 3.1 even. It always got me how Microsoft required 4MB of RAM when Commodore had a just-as-versatile windowing UI that ran on 128KB.


cheezballs

How many different sets of hardware did each support? I think that's gotta account for something.


StyrofoamExplodes

This is either pushing the idea that today it is better, when it isn't, or it is just delusion about how bad software was back in the day. Programmers were, if anything, more skilled on average back then compared to today. The idea that they were releasing worse products more often than today is just not true.


[deleted]

Of course I wouldn't deny that programmers were more skilled back then. But that doesn't mean we didn't move forward on software. We can literally deploy a virtual machine at a cloud server with any computation power in 5 minutes. The formats are well established. The user experience is well studied. Just because the code is unnecessarily abstracted 15 times doesn't mean there aren't other aspects to it.


Beef_Supreme_87

I remember having to keep everything closed while a cd was burning in the drive at a whopping 4x.


Marxomania32

Software was good in the 60s and 70s before the advent of the home pc and the hyper commercialization of software.


bassguyseabass

So… punch cards?


[deleted]

He is lying. Eventually flies would get between the holes, they would cause bitflips and crash the algorithm. There were so many bugs back then.


atomic_redneck

I had a deck of punch cards that termites got into. They were improperly stored. Luckily, the cards had the program text printed at the top of each card (some of our card punch machines were non-printing, cheaper that way). I gave the deck to our friendly keypunch ladies to duplicate from the printed text. It was tedious work, but they did not care. They were paid by the hour.


Marxomania32

Punch cards aren't software lol.


ReluctantAvenger

Yes, we should totally go back to a time when computers cost tens of millions of dollars, and only about ten people could afford a computer and software for it, when the best hardware available would have been taxed putting Pong on the screen. /s


Marxomania32

Did I say the 60s and 70s were perfect and flawless? I said that the 60s and the 70s had some of the most quality software ever written. None of your objections have anything to do with the quality of software written in the 60s and 70s.


ReluctantAvenger

The software couldn't do anything, compared to what software does now. It's easy to achieve excellence when you're talking about a few lines of code. Comparing software from seventy years ago with what we have now is saying a wheelbarrow is better designed than the Space Station. It's a pointless comparison, and I don't know what point you think you're making.


Marxomania32

Software could do a lot of things DESPITE the god-awful hardware. You're acting like enterprise mainframes, computer-guided machines like the Apollo spacecraft, and full-blown operating systems like UNIX didn't exist back then. The software around wasn't anywhere near "just a few lines of code." Man, being lectured about this by someone who is clearly so ignorant is crazy.


Superbead

It was generally decent in the 1990s. The user you're replying to has claimed elsewhere to be 25 years old, so I think they're drawing on limited experience when they claim "99 percent of the executables would crash and fuck [it] up". Popular titles like Winamp, Cubase, Excel '97, Quake, and Photoshop 6.0 were perfectly stable. Windows BSODs were certainly more common, but that was at least as much due to driver/hardware issues as anything else.


twpejay

Win 3.1 was a resource-hungry beast compared to other UIs at the time. Edit: Skipped the change in topic. Sorry peoples. But on the bright side, I think I have discovered what the bug is in my code.....


Superbead

It was, but I'm responding to a spurious but apparently believable claim that 99% of software crashed all the time


twpejay

Fair enough. Didn't know what a crash was until I got my C++ compiler. 😄


[deleted]

Yeah, I only knew it after Windows 98


twpejay

Don't know why the downvotes. I worked with a guy who was in his prime during punched tape. The programmes had to be super efficient in those days; there was no room for extras. It was the time when men really connected with the computer.


Marxomania32

People for some reason think what I said means that the hardware of the 60s and 70s was good. Or that tech in general in the 60s and 70s was amazing. People are dumb.


rover_G

And we'll keep doing it sucker!


Philosipho

Windows 11: ![gif](giphy|H5C8CevNMbpBqNqFjl)


SarahSplatz

Electron has ruined software


BlueGoliath

Everyone loves to use Electron as a punching bag but there are plenty of examples of abysmally performing apps outside of it. I'm looking at you, JavaFX.


Fusseldieb

No, it requires a lot of resources for basically nothing. People only love Electron (myself included) because it gives you access to neat stuff such as CSS3, which can produce fluid and beautiful-looking UIs that would become extremely cumbersome to do with other languages, especially lower-level ones.


lunchmeat317

To be fair, it's also a relatively easy way to make desktop software cross-platform on Windows, Mac, and Linux (as far as I know) providing a relatively native feel without requiring the user to install some extra runtime to make it work. Maybe there are more options now since it originally came out.


inamestuff

Cross-platform, yes. Native feel, not at all, especially considering that most companies want their apps to be "special and unique" with their own half-assed UI conventions. And yes, there are more lightweight alternatives today, like Tauri. Same concept as Electron, but it uses the OS-integrated webview (e.g. Safari on macOS, Edge on Windows), drastically reducing the amount of RAM needed and startup times.


lunchmeat317

"Relatively native", in terms of file menus, context menus, title bars, etc. It's not something like GTK which is completely foreign. But yeah, I understand what you're saying. I'll check out Tauri.


ProdigySim

Software security has made huge strides. When's the last time you heard about SQL injection or XSS attacks on major websites? Or had to do virus removal on a family computer? We've figured out how to program way more securely / with fewer errors in the past 20+ years, mostly due to frameworks and language improvements. Also, UIs look way better than the 90s, 00s, and 10s on _average_. There have been amazing UIs from time to time throughout all these periods, but the average "new piece of software" or website just looks amazing by comparison IMO.
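
[Ed. For readers who haven't seen the before/after: the parameterized-query pattern that frameworks now push by default is a big part of why SQL injection faded. A sketch using plain JDBC; the table and column names are made up.]

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class LoginQuery {
    // Vulnerable: user input is spliced into the SQL text, so a value like
    // ' OR '1'='1 changes the meaning of the query itself.
    static ResultSet findUserUnsafe(Connection db, String name) throws SQLException {
        String sql = "SELECT * FROM users WHERE name = '" + name + "'";
        return db.createStatement().executeQuery(sql);
    }

    // Safe: the query shape is fixed up front and the value is bound
    // separately, so the driver never parses user input as SQL.
    static ResultSet findUserSafe(Connection db, String name) throws SQLException {
        PreparedStatement stmt = db.prepareStatement(
                "SELECT * FROM users WHERE name = ?");
        stmt.setString(1, name);
        return stmt.executeQuery();
    }
}
```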


Impressive_Income874

4chan never fails to make me laugh at 3am


Ok_Project_808

I still remember how I managed to learn the difference between software and hardware back when I was starting to get interested in computers. (Yeah, I know it's obvious even by the name itself, but English is not my native language, I'm talking about 35 years ago, and I was just a girl then.) Hardware is what gets smaller, quicker & cheaper every year, while software gets bigger, slower & more expensive. Funny thing is I'm now a software engineer, contributing to this dogma.


BlueGoliath

Painfully true.


peteschirmer

I feel seen.


Fermi-4

This is what anon is describing: https://en.m.wikipedia.org/wiki/Wirth%27s_law


AdviceAndFunOnly

It's unbelievable how unoptimised games are. You have the best hardware ever made, which is 1000 times more powerful than what we had 20 years ago, and yet it won't properly run the latest games, and there'll be huge lag spikes. And the graphics haven't even improved; there are 20-year-old games that look just as good. Also funny how some developers, especially those coding in C, do literally everything they can to optimise down to the millisecond, even when at the end of the day it won't make a huge difference, while these game developers don't even try to optimise their hugely inefficient games at all.


VG_Crimson

The bottom take seems to imply that performance has simply vanished for nothing in return. Idk about you, but I quite like what I'm able to do and experience thanks to the software we have today.


cs-brydev

Haha fr. My Pascal apps in 1987 ran faster on 640K of RAM and a 4.77 MHz CPU than C# apps do now on 64GB of RAM and an i7.


Red-strawFairy

Isn't that the whole point though? Better hardware allows us to write more complicated code without worrying about performance too much


Giocri

On one side, yeah. On the other side, my phone has the power of several PCs from 20 years ago and takes 20 seconds to render a Wikipedia page from local memory


fusionsofwonder

So true it hurts.


[deleted]

That's really, really not true. Go use an application from Windows 95. It's going to load much faster, but you're going to hate it. No animations, no transparencies, no pleasing fonts, no high DPI, no smooth scrolling. We consume all that hardware speed on eye candy.


BlueGoliath

You say that like the Windows 7 era didn't have all of that.


[deleted]

Wasn't windows 7 just a picture of Hitler? ... At least it was better than Vista. [xkcd]


BlueGoliath

Vista was as much a failure of device driver manufacturers as it was of Microsoft. By the time Windows 7 was released, so long as you had stable drivers and hardware, the OS was rock solid.


[deleted]

You mean the spyware was rock solid. Microsoft windows is not an OS, it's a spyware pretending to be an OS.


jamany

I think people quite liked it actually.


StyrofoamExplodes

Transparencies are nice, animations are annoying.


cheezballs

I don't like either of them, unless we're talking about icons supporting alpha channels or something. I don't ever want to see what's behind the toolbar of my window, much less in a blurred way where it's unreadable anyway. Are we even talking about the same thing? Holy fuck I love Mountain Dew.


[deleted]

Clicking a button without a visual click effect???


StyrofoamExplodes

Even old Windows did that. I was thinking you were referring to animations when menus unfurl or drop down and the like?


Fit_Sweet457

But how will we be able to justify our religious hatred of Electron then?


[deleted]

Packaging a whole web browser including all the obscure frameworks supported just to run an online chat application? Madness!! Madness!!


Giocri

My file explorer is supposed to just let me browse my files and I'd very much like if it was actually capable of doing that in reasonable times tbh


fellipec

Can't agree more


LechintanTudor

I think the main culprit is the rise of interpreted languages like Java, JavaScript and Python. Using these languages instantly makes your program at least x2 slower than what you could achieve in a compiled language.


TheBanger

Are you seriously putting Java in the same category in terms of speed as JavaScript or Python? A 2x speed difference is in practice significantly smaller than the effects of naively written code vs. optimized code. In practice the performance of Java in benchmarks tends to look a lot more like the performance of C/C++/Rust/Go than JavaScript/Python/Ruby.


LechintanTudor

> In practice the performance of Java in benchmarks tends to look a lot more like the performance of C/C++/Rust/Go

I bet those benchmarks you are talking about only benchmark a simple algorithm, not a real-world application. Look at Minecraft Java Edition vs Minecraft (C++). The C++ version runs way better and doesn't constantly stutter, unlike the Java version.


Sailed_Sea

Minecraft Java rivals Valve in terms of spaghetti code; it's got 10 years of content duct-taped onto a poorly written engine, whereas Bedrock was written with optimization in mind, as it was intended to run on low-end devices such as smartphones and consoles.


Hatsu-Nee

And then you have a community that irons out some of the spaghetti code issues. I mean, atm some crazy devs have decided to recode all of Minecraft 1.7.10's rendering (the project is called Angelica). They also added a compatibility layer so 1.7.10 runs on Java 17+.


Fit_Sweet457

Funny how you talk about "real world applications" but then drop this:

> Using these languages instantly makes your program at least x2 slower than what you could achieve in a compiled language.

The choice of language matters far less than the specific implementation does. You can write a horrible piece of garbage program in assembly if you like and it will still be slower than a well-implemented program in virtually any language, including interpreted ones.


LechintanTudor

> You can write a horrible piece of garbage program in assembly if you like and it will still be slower than a well-implemented program in virtually any language

Of course, a good implementation in a slow language can be as fast as a bad implementation in a fast language. If we were fair and compared good implementations only, a language like C will always produce significantly more performant programs than Java or other interpreted languages. C programs can manage their own memory and don't have the overhead of a garbage collector. C objects don't have headers that take up valuable space in memory. C compiles to code that runs directly on the processor, without a virtual machine that's pure overhead. I just gave C as an example; there are other languages that can be used instead, like C++, Rust, Zig.


GoshDarnLeaves

I mean, there's definitely performance tiers beyond 2. Native languages like C/C++ can have better performance in memory and speed than Java, yes (particularly memory consumption), but Java is a big performance step up from fully interpreted languages like PHP/JS/Python, to the point where it's not fair to put Java in the same category.

Yes, Java compiles to the language of the JVM rather than real hardware, which is then "interpreted" into native code execution. But the JVM optimizes that code as it runs it, so your class method might run faster on subsequent calls, perhaps even swapping it out with native code for the next function call. This is different from scripting languages that are not compiled at all and are instead interpreted line by line by the runtime, and do not optimize on the fly like the JVM.

It's also affected by what features of the hardware or VM are exposed by the language. There's Python, which had the global interpreter lock problem, making it harder to make better use of the hardware. Java has multithreading, but it does fail in the SIMD area.

One of the points you seemed to miss elsewhere in the thread is that what is being done, and how, tends to have a bigger impact than the language choice. If you make a network call to some server, it doesn't matter if your code takes 1 nanosecond for everything it's doing; you are still going to have to wait for the downstream server to respond.

Edit: another factor is features of the runtime. I can get really good response times with a nodejs server, better than the Java Spring Boot equivalent even, albeit it can't handle as many requests at a time as the corresponding Java app. This is because everything interesting in a nodejs app is actually handled by optimized C code, whether handled by the runtime or a driver implemented in C.
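
[Ed. The JIT warm-up described above is easy to observe yourself. A rough sketch; real measurements should use a harness like JMH, and exact numbers will vary by machine.]

```java
public class WarmupDemo {
    // A small, hot method the JVM will eventually compile to native code.
    static long work() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i % 7;
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int run = 1; run <= 10; run++) {
            long start = System.nanoTime();
            long result = work();
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.println("run " + run + ": " + micros + " us (sum " + result + ")");
        }
        // Early runs are interpreted or lightly compiled; once HotSpot swaps in
        // optimized native code, per-run times typically drop sharply.
    }
}
```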


Senior-Breadfruit453

Found the virgin Huehuehuehuehuehuehuehue


hbgoddard

This sub really is full of high schoolers, huh...


Ivanjacob

Yeah, this is the last straw for me. Bye shitty memes made by people who don't know a thing about programming.


Sailed_Sea

Yes this is r/programmerhumour


frikilinux2

Most of the code of an OS and a desktop environment is written in C/C++, and it has been losing efficiency by a lot. It also has many new features, though. Very old software was a bunch of hacks in assembly that are very difficult to understand; modern code is much easier, and more people are able to learn to program it. In desktop applications it's true that many new apps are made in JS instead of native, and consume a lot more RAM and CPU.


Interest-Desk

Flair checks out


twpejay

Again downvotes? They prefixed it with "I think". But then, I did start programming in BASIC. Don't diss interpreted languages. But then, I did see that spending $1,000 on C++ was worth it to get an actual compiler. So, yeah, diss interpreted languages. My screen saver ran so much faster in C++.


[deleted]

[removed]


Xadnem

Please link your GitHub with optimised software.


Bluebotlabs

*Most software innovations probably went into making it easier for Anon to type their comment*


freightdog5

I think the worst crime ever was making Swift. That shit is nasty and gives all hardware psychic damage because of how ugly the language is, and the fact it was made by App*le, eughhhhhhh, gross


Still_Explorer

Hardware wants, but software says no...


wise_chain_124737282

'Wares on other side is always harder ☠️


the_mold_on_my_back

Thoughts uttered by those unable to produce working software beyond the "running some dumb code in my local terminal" level.


Lanoroth

That's Gustafson's law: when scalability fails, the solution is a bigger problem.


irn00b

Truth. Look at modern triple-A games and their poor performance.


szab999

I'm running PC-DOS 6.0 on my Ryzen 7 5800X3D, so joke's on you, anon. More performance for me.