nickjj a day ago

I'm running a box I put together in 2014 with an i5-4460 (3.2 GHz), 16 GB of RAM, a GeForce 750 Ti, a first-gen SSD, and an ASRock H97M Pro4 motherboard, plus a reasonable PSU, case, and a number of fans. All of that, parted out at the time, was $700.

I've never been more fearful of components breaking than current day. With GPU and now memory prices being crazy, I hope I never have to upgrade.

I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording / editing with a 4K monitor and a second 1440p monitor hooked up. Minor gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.

For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

  • Cerium a day ago

    Don't worry, if you are happy with those specs you can get corporate ewaste Dell towers on eBay for low prices. Search "Dell Precision tower"; I just saw a listing with 32 GB RAM and a 3.6 GHz Xeon for about 300 USD.

    Personally, at work I use the latest hardware; at home I use ewaste.

    • silverquiet a day ago

      I got a junk Precision workstation last year as a "polite" home server (it's quiet and doesn't look like industrial equipment, but still has some server-like qualities, particularly the use of ECC RAM). I liked it so much that it ended up becoming my main desktop.

      • dotancohen 19 hours ago

        My main desktop is temporarily a Dell server from around 2012 or so. Two were thrown out, each with two 2 GiB sticks of RAM, so I poached the other machine's RAM for a grand total of 8 GiB. I also threw in a small SSD for the / partition (/home is on the old HDD). The thing is dirt slow but I never notice, even YouTube video playback works fine. Even on hardware well over a decade old, Debian runs fine.

    • vondur a day ago

      Ha, I bought one of those for $500 from eBay. It's a dual Xeon Silver workstation with an Nvidia Quadro P400 8GB, 128 GB RAM, and a 256 GB SSD. I threw in a 1 TB SSD and it's been working pretty well.

      • Forgeties79 a day ago

        What are the limitations of machines like these?

        • snerbles a day ago

          I too have a crippling dual CPU workstation hoarding habit. Single thread performance is usually worse than enthusiast consumer desktops, and gaming performance will suffer if the game isn't constrained to a single NUMA domain that also happens to have the GPU being used by that game.

          On the other hand, seeing >1TiB RAM in htop always makes my day happier.

          • fooker 17 hours ago

            Any pointers on how to buy one?

            • snerbles 16 hours ago

              Personally I use eBay and find the most barebones system I can, then populate the CPU+RAM with components salvaged from e-wasted servers. There are risks with this, as I've had to return more than one badly-bent workstation that was packed poorly.

              ---

              So the Dell Precision T7920 runs dual Intel Xeon Scalable (Skylake) CPUs and has oodles of DIMM slots (24!), but you'll need to use a PCIe adapter to run an NVMe drive. FlexBays give you hot-swappable SATA (SAS too, but only if you're lucky enough to find a system with an HBA, or add one yourself). And if you manage to salvage 24x 64GB DDR4 DIMMs, you'll have a system with a terabyte-and-a-half of ECC RAM - just expect to deal with a very long initial POST and a lot of blink codes when you encounter bad sticks. The power supply is proprietary, but can be swapped from the outside.

              The T7820 is the single-CPU version, and has only 6 DIMM slots. But it is more amenable to gaming (one NUMA domain), and I have gifted a couple to friends.

              If you're feeling cheap and are okay with the previous generation, the Haswell/Broadwell-based T7910 is also serviceable - but expect to rename the UEFI image to boot Linux from NVMe, and it's much less power efficient if you don't pick an E5 v4 revision CPU. I used a fully-loaded T7910 as a BYOD workstation at a previous job, worked great as a test environment.

              The Lenovo ThinkStation P920 Tower has fewer DIMM slots (16) than the T7920, but has on-motherboard M.2 NVMe connectors and three full 5.25" bays. I loaded one with Linux Mint for my mother's business; she runs the last non-cloud version of QuickBooks in a beefy, network-isolated Windows VM, and it works great for that. Another friend runs one of these with Proxmox as a homelab-in-a-box.

              The HP Z6 G4 is also a thing, though I personally haven't played with one yet. I do use a salvaged HP Z440 workstation with a modest 256GB RAM (don't forget the memory cooler!) and a 3090 as my ersatz kitchen table AI server.

              • 71bw 5 hours ago

                    >and a lot of blink codes when you encounter bad sticks
                Which sadly happens quite a lot with ECC DDR4 for whatever reason.

                    >If you're feeling cheap and are okay with the previous generation, the Haswell/Broadwell-based T7910 is also serviceable
                
                The T5810 is a well-known machine, very tinkerable; it just works with NVMe adapters (they show up as a normal NVMe boot option in UEFI) and even has TPM 2.0 (!!!) after a BIOS update. Overall, it's the 2nd-best affordable Haswell-EP workstation after the HP Z440, in my opinion.

                    >E5 v4 revision CPU
                They are less efficient than v3 CPUs due to the lockdown of Turbo Boost, but then again, on a Precision you'd have to flash the BIOS with an external flasher regardless to get TB back.

                • snerbles an hour ago

                  Forgot about Dell gimping Turbo Boost on that firmware.

                  Another route is the PowerEdge T440 (tower server), which does respect Broadwell-EP turbo logic without a reflash. Not quite as quiet as a workstation, though.

              • Forgeties79 15 hours ago

                Man, this is so much incredible information in one comment.

        • t0mas88 a day ago

          Power usage is the main limitation of using these as a home server. They have a high idle power use.

          • trollbridge a day ago

            One of the reasons I use these is that it’s cold half the year, and it’s not hard to basically use them to supplement the heat.

        • wittjeff 7 hours ago

          I bought a Dell Precision 7910 2x Xeon E5-2687W v3 (10 cores, 20 threads each) with 32GB RAM and 512GB SSD for $425 including shipping. I found that Windows 11 Pro will recognize only 20 of the virtual cores/threads. I don't feel a need to upgrade to more expensive Microsoft OSs at this time, so I just run Ubuntu natively on that box, which recognizes all of it. Assuming used DDR4 RAM returns to more reasonable prices at some point, I intend to load that box up to the 768GB max.

        • bri3d 21 hours ago

          Very bad performance per watt and higher maintenance needs. Bad performance per watt generally means a larger form factor and more noise as well.

        • vondur 17 hours ago

          The CPU is far less powerful than a single current Ryzen chip, and the new system is far more power efficient. There are no super-fast USB connections like a new system has (it does have a USB-C 10 Gbps connection, though). Overall, if you can live with a somewhat older machine, it's pretty decent.

        • arprocter a day ago

          On Dell you'll probably be stuck with the original mobo, and their SFFs don't take standard PSUs

          • sevensor 21 hours ago

            In favor of their SFFs, they get retired 10k at a time, so you might as well pick up a second one for spares.

            • arprocter 21 hours ago

              Not a bad call, although you'll probably need to upgrade the PSU to add a GPU (if you can find one small enough to fit the SFF case)

        • MrVitaliy a day ago

          Just performance when compared to current-generation hardware. Not significantly worse, but things like DDR4 RAM and single-thread performance show the signs of aging. Frankly, for similar $$$ you can get new hardware from Beelink or equivalent.

          • Forgeties79 a day ago

            Got it, so basically it's one of those things you do if 1) the project interests you and/or 2) you get one dirt cheap and don't have high expectations for certain tasks.

    • trollbridge a day ago

      I have some Dell server with dual Xeons and 192 GB RAM. It is NUMA, but that’s fine for Docker workloads, where you can just pin each container to a CPU.

      The RAM for that is basically ewaste at this point, yet it runs the workloads it needs to do just fine.
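
      For the curious: Docker exposes this via the `--cpuset-cpus` and `--cpuset-mems` flags on `docker run`. The same pinning idea at the process level can be sketched in Python (Linux-only; the core numbers are illustrative, look yours up with `lscpu`):

```python
import os

# Restrict the current process to a hypothetical NUMA node 0's cores.
# Docker's container-level equivalent is --cpuset-cpus / --cpuset-mems.
original = os.sched_getaffinity(0)            # CPUs we may currently run on
node0_cpus = ({0, 1} & original) or original  # stay within what's available

os.sched_setaffinity(0, node0_cpus)           # pin to those CPUs
print(sorted(os.sched_getaffinity(0)))        # confirm the restricted set

os.sched_setaffinity(0, original)             # restore the original affinity
```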

      • fooker 17 hours ago

        Where can I buy something like this for a reasonable price ?

        • trollbridge 3 hours ago

          I got it off of eBay, although I paid a lot more than other people do.

    • cestith a day ago

      At home some of my systems are ewaste from former employers who would just give it to employees rather than paying for disposal. A couple are eBay finds. I do have one highish-end system at a time specifically for games. Some of my systems are my old hardware reassembled after all the parts for gaming have been upgraded over the years.

    • agapon 3 hours ago

      Used RDIMM / LRDIMM prices have also started going up, quickly.

    • ge96 a day ago

      OptiPlexes used to be my go-to SFF. I had a 1050 Ti in there; not crazy, but it worked for basic gaming.

    • gpderetta a day ago

      Surely these will soon be scavenged for RAM? Arbitrage opportunity?

      • legobmw99 a day ago

        If they’re DDR4 (or even DDR3), they have no value to e.g. OpenAI, so it shouldn’t really matter.

        • noboostforyou a day ago

          But it's a cascading effect: OpenAI gobbled up all of DDR5 production to the point that consumers are choosing to upgrade their older DDR4 systems instead of paying even more for a new system that uses DDR5. As a result, DDR4 RAM is at a new all-time high - https://pcpartpicker.com/trends/price/memory/

        • auspiv a day ago

          DDR4 prices are up 2-6x in the last couple months depending on frequency. High end, high speed modules (e.g. 128GB 3200MHz LRDIMM) are super expensive.

          • legobmw99 a day ago

            Isn’t that due to different reasons (like the end of production for older standards)? I recall the same happening shortly after manufacturing for DDR3 ceased, before eventually demand essentially went to 0

            • agapon 3 hours ago

              Even RDIMM / LRDIMM prices have recently started going up. And I thought that those would be safe, because neither "big AI" nor regular consumers need them.

        • jl6 a day ago

          Demand spills over to substitutes.

        • gpderetta a day ago

          The price of DDR4 is also going up!

    • zzzeek a day ago

      I've dealt a bit with ewaste kinds of machines, old Dells and such, and have two still running here. The issue is they use a crapton of power. I had one such ewaste Dell machine that I just had to take to the dump; it was so underperforming while it used 3x more power than my other two Dells combined.

  • kube-system a day ago

    > I've never been more fearful of components breaking than current day.

    The mid 90s was pretty scary too. Minimum wage was $4.25 and a new Pentium 133 was $935 in bulk.

    • tossandthrow a day ago

      If you were on minimum wage in the 90s, your livelihood likely didn't rely on Pentium processors.

      Also, it is frightening how close that is to current day minimum wage.

      • kube-system a day ago

        I was an unemployed student then -- a generous family member gifted me my first Windows PC, and it cost about the same as a used car.

      • briffle a day ago

        Yep, I had a Cyrix processor in mine during that time. Slackware didn't care.

        • pixl97 a day ago

          It also worked as a very good space heater.

      • silisili a day ago

        1990-1997 averaged >4% yearly compounded minimum wage hikes, which is probably about where it should have been. The late 90s to today has been <1.25%.
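
        Those rates check out against the federal minimum wage history (figures I'm supplying, not from the comment: $3.80 in 1990, $5.15 from the 1997 hike, $7.25 since 2009); a quick sketch:

```python
# Compound annual growth rate of the US federal minimum wage.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

print(f"1990-1997: {cagr(3.80, 5.15, 7):.2%}")   # a bit over 4%/year
print(f"1997-2025: {cagr(5.15, 7.25, 28):.2%}")  # under 1.25%/year
```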

      • immibis a day ago

        If you account for inflation it's actually higher than current minimum wage.

      • adventured a day ago

        Except nobody earns the minimum wage today; it's less than 1/2 of 1% of US labor.

        The median full-time wage is now $62,000. You can start at $13 at almost any national retailer, and $15 or above at CVS / Walgreens / Costco. The cashier positions require zero work background, zero skill, zero education. You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.

        • jfindper a day ago

          >You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.

          Holy moly! 11 whole dollars an hour!?

          Okay, so we went from $4.25 to $11.00. That's a 159% change. Awesome!

          Now, lets look at... School, perhaps? So I can maybe skill-up out of Little Caesars and start building a slightly more comfortable life.

          Median in-state tuition in 1995: $2,681. Median in-state tuition in 2025: $11,610. Wait a second! That's a 333% change. Uh oh.

          Should we do the same calculation with housing...? Sure, I love making myself more depressed. 1995: $114,600. 2025: $522,200. 356% change. Fuck.
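
          The percent changes quoted above are easy to verify:

```python
# Percent change between the 1995 and 2025 figures quoted above.
def pct_change(old, new):
    return (new - old) / old * 100

print(round(pct_change(4.25, 11.00)))     # wage: 159 (%)
print(round(pct_change(2681, 11610)))     # tuition: 333 (%)
print(round(pct_change(114600, 522200)))  # house price: 356 (%)
```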

          • reissbaker 20 hours ago

            This will probably be an unpopular reply, but "real median household income" — aka, inflation-adjusted median income — has steadily risen since the 90s and is currently at an all-time high in the United States. [1] Inflation includes the cost of housing (by measuring the cost of rent).

            However, we are living through a housing supply crisis, and while overall cost of living hasn't outpaced incomes, housing's share of it has massively multiplied. We would all be living much richer lives if we could bring down the cost of housing - or at least have it flatline, and let inflation take care of the rest.

            Education is interesting, since most people don't actually pay the list price. The list price has gone up a lot, but the percentage of people paying list price has similarly gone down a lot: from over 50% in the 90s for state schools to 26% today, thanks to a large increase in subsidy programs (student aid). While real education costs have still gone up somewhat, they've gone up much less than the prices you're quoting lead you to believe: those are essentially a tax on the rich who don't qualify for student aid. [2]

            1: https://fred.stlouisfed.org/series/MEHOINUSA672N

            2: https://econofact.org/how-much-does-college-really-cost

            • jfindper 19 hours ago

              I have several qualms with how the real median household income is calculated, specifically the consumer price index.

              But I agree that tackling housing alone would be significant.

              • reissbaker 17 hours ago

                I think everyone has quibbles about the CPI. Ultimately though, it would take a lot of cherry-picking to make it seem like overall cost of living has gone up 3x while wages have gone up less. As a counterexample, an NES game in 1990 cost $50 new (in 1990 dollars! Not adjusted for inflation). Battlefield 6 cost $70 new this year (in 2025 dollars), and there were widespread complaints about games getting "too expensive." In real terms games have become massively less expensive — especially considering that the budget for Battlefield 6 was $400MM, and the budget for Super Mario World in 1990 was less than $2MM.
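
                To put a number on the game example (the ~2.4x CPI multiplier from 1990 to 2025 is my rough assumption, not from the comment):

```python
# A $50 NES cartridge from 1990, restated in 2025 dollars.
CPI_MULTIPLIER_1990_TO_2025 = 2.4  # assumption: approximate cumulative CPI

nes_in_2025_dollars = 50 * CPI_MULTIPLIER_1990_TO_2025
print(nes_in_2025_dollars)  # 120.0 -- well above a $70 game today
```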

                There are a zillion examples like this. Housing has gone way up adjusted for inflation, but many other things have gone way, way down adjusted for inflation. I think it's hard to make a case that overall cost of living has gone up faster than median wages, and the federal reports indicate the opposite: median real income has been going up steadily for decades.

                Housing cost is visible and (of course, since it's gone up so much) painful. But real median income is not underwater relative to the 90s. And there's always outrage when something costs more than it used to, even if that's actually cheaper adjusted for inflation: for example, the constant outrage about videogame prices, which have in fact massively declined despite requiring massively more labor to make and sell.

          • AnthonyMouse a day ago

            You're identifying the right problem (school and housing costs are completely out of hand) but then resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.

            The easy way to realize this is to notice that the median wage has increased by proportionally less than the federal minimum wage has. The people in the middle can't afford school or housing either. And what happens if you increase the minimum wage faster than overall wages? Costs go up even more, and so does unemployment when small businesses who are also paying those high real estate costs now also have to pay a higher minimum wage. You're basically requesting the annihilation of the middle class.

            Whereas if you make housing cost less, that helps the people at the bottom and the people in the middle.

            • jfindper a day ago

              >resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.

              I'm not really resorting to any solution.

              My comment is pointing out that when you only do one side of the equation (income) without considering the other side (expenses), it's worthless. Especially when you are trying to make a comparison across years.

              How we go about fixing the problem, if we ever do, is another conversation. But my original comment doesn't attempt to suggest any solution, especially not one that "requests the annihilation of the middle class". It's solely to point out that adventured's comment is a bunch of meaningless numbers.

              • AnthonyMouse a day ago

                > It's solely to point out that adventured's comment is a bunch of meaningless numbers.

                The point of that comment was to point out that minimum wage is irrelevant because basically nobody makes that anyway; even the entry-level jobs pay more than the federal minimum wage.

                In that context, arguing that the higher-than-minimum wages people are actually getting still aren't sufficient implies an argument that the minimum wage should be higher than that. And people could read it that way even if it's not what you intended.

                So what I'm pointing out is that that's the wrong solution and doing that rather than addressing the real issue (high costs) is the thing that destroys the middle class.

                • jfindper a day ago

                  >implies an argument that the minimum wage should be higher than that.

                  It can also imply that expenses should come down, you just picked the implication you want to argue against.

                  • AnthonyMouse a day ago

                    Exactly. When it's ambiguous at best it's important that people not try to follow the bad fork.

            • Sohcahtoa82 17 hours ago

              > (school and housing costs are completely out of hand)

              On the housing side, the root problem is obvious:

              Real estate cannot be both affordable and considered an investment. If it's affordable, that means the price is staying flat relative to inflation, which makes it a poor investment. If it's a good investment, that means the value is rising faster than inflation, which means unaffordability is inevitable.

              The solution to the housing crisis is simple: Build more. But NIMBYs and complex owners who see their house/complex as an investment will fight tooth-and-nail against any additional supply since it could reduce their value.

              • AnthonyMouse 3 hours ago

                > Real estate cannot be both affordable and considered an investment. If it's affordable, that means the price is staying flat relative to inflation, which makes it a poor investment. If it's a good investment, that means the value is rising faster than inflation, which means unaffordability is inevitable.

                This is a misunderstanding of what makes something a good investment. Something is a good investment if it's better for you than your other alternatives.

                Suppose you buy a house and then have a mortgage payment equivalent to the amount you'd have been paying in rent until the mortgage is paid off. At that point you have an asset worth e.g. $200,000 and you no longer have a mortgage payment. By contrast, if you'd been paying rent instead then you'd have to continue paying rent. That makes the house a good investment even if its value hasn't increased by a single cent since you bought it -- it could even have been a good investment if its value has gone down, because its true value is in not having to pay rent. Paying $300,000 over time for a house which is now worth $200,000 leaves you $200,000 ahead of the person who paid $300,000 in rent in order to end up with the asset you can find on the inside of an empty box.

                Likewise, suppose you're in the landlord business. In one city it costs a million dollars to buy a two bedroom unit and then you can rent it out for $10,000/month. In another city the same two bedroom unit costs $10,000 to buy but then you could only rent it out for $100/month. If your business is to buy the property and rent it out, is one of these a better investment than the other? No, the ROI is exactly the same for both of them and either one could plausibly be a good investment even without any appreciation.
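
                The two-city comparison works out to identical gross yields:

```python
# Gross rental yield (annual rent / purchase price) for the two cities above.
def gross_yield(price, monthly_rent):
    return monthly_rent * 12 / price

print(gross_yield(1_000_000, 10_000))  # 0.12 -- expensive city
print(gross_yield(10_000, 100))        # 0.12 -- cheap city, same ROI
```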

                In both cases the value of the property doesn't have to increase to make it a good investment and in both cases the value of the property may not even come into play, because if you're planning to keep the asset in order to live in it or rent it out then you can't simultaneously sell it. And for homeowners, even if you were planning to sell it eventually, you'd then still need somewhere to live, so having all housing cost more isn't doing the average homeowner any good. If they sold they'd only have to pay the higher price to live somewhere else.

                However, there is one major difference between homeowners and landlords. If you increase the supply of housing, rents go down. For homeowners that doesn't matter, because they're "renting" to themselves; they pay (opportunity cost) and receive (imputed rent) in equal amounts, so it doesn't matter to them if local rents change -- or it benefits them because it lowers local cost of living and then they pay lower prices for local things. Whereas landlords will fight you on that to their last breath, because that's their actual return on investment. Which is why they're the villains and they need to lose.

          • genewitch a day ago

            1980 Mustang vs. 2025 Mustang is what I usually use. In the past 12 years, my per-kWh electricity cost has doubled.

            In the mid 90s you could open a CD (certificate of deposit at a bank or credit union) and get 9% or more APY. Savings accounts had ~4% interest.

            In the mid 90s, a gallon of gasoline in Los Angeles County was $0.899 in the summer and less than that any other time. It's closer to $4.50 now.

          • mrits a day ago

            The BBQ place across the street from me pays $19/hour to be a cashier in Austin. Or the sign says it does anyways

            • mossTechnician a day ago

              Does the sign happen to have the words "up to" before the dollar amount?

            • jfindper a day ago

              Sweet! According to austintexas.gov, that's only $2.63 below the 2024 living wage - $5.55 below, if you use the MIT numbers for 2025.

              As long as you don't run into anything unforeseen like medical expenses, car breakdowns, etc., you can almost afford a bare-bones, mediocre life with no retirement savings.

              • hylaride 20 hours ago

                I don't disagree that there has been a huge issue with stagnant wages, but not everybody who works minimum wage needs to make a living wage. Some are teenagers, people just looking for part-time work, etc. Pushing the minimum wage up too high risks destroying jobs that are uneconomical at that level but would have been better than nothing for many people.

                That being said, there's been an enormous push by various business groups to do everything they can to keep wages low.

                It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...

                • jfindper 20 hours ago

                  >but not everybody who works minimum wage needs to make a living wage

                  I think this is a distraction that is usually rolled out to derail conversations about living wages. Not saying that you're doing that here, but it's often the case when the "teenager flipping burgers" argument is brought up.

                  Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.

                  And, in any case, the solution could also be (totally, or in part) a reduction in expenses instead of increase in income.

                  >It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...

                  That's for sure! I know it's not getting solved on the hacker news comment section, at least.

                  • hylaride 19 hours ago

                    > I think this is a distraction that is usually rolled out to derail conversations about living wages. Not saying that you're doing that here, but it's often the case when the "teenager flipping burgers" argument is brought up.

                    If you're focusing on minimum wage, they tend to be highly coupled, though some jurisdictions have lower minimum wages for minors to deal with this.

                    > Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.

                    Few minimum wage jobs even offer the option to work full time. Many retail environments have notoriously unpredictable shifts that are almost impossible for workers to plan around. I've heard varying reasons for this, from companies liking the flexibility of more employees working fewer hours, down to keeping people off the full-time payroll so they legally don't have to offer benefits. The result is that minimum wage earners often have to juggle multiple jobs, childcare, and the negative effects of commuting to all of them.

                    This also ignores many other factors around poverty, such as housing costs and other inflation.

                    > That's for sure! I know it's not getting solved on the hacker news comment section, at least.

                    For sure! 99% of people on HN haven't had to experience living long term off of it. I did for a while in college, where outside of tuition I had to pay my own way in a large city (I fully acknowledge that this is anecdotal and NOT the same as poverty living). I only had to feed myself, not think about saving for the future, and I was sharing a house with other geeky roommates where we had some of the best times of our lives. I don't think we could have pulled that off in today's economic environment...

                • tossandthrow 8 hours ago

                  This is a complete strawman.

                  The part-time workers have been sorted out, as living-wage calculations assume full-time work.

                  Even if you are a teenager, you deserve a living wage - if a teenager living at home needs to work full time, then that household likely needs some of that money.

        • GoatInGrey a day ago

          Counterpoint: affording average rent for a 1-bedroom apartment (~$1,675) requires that exact median full-time wage. $15 an hour affords you about $740 for monthly housing expenses. One can suggest getting two roommates for a one-bedroom apartment, but they would be missing the fact that this is very unusual for the last century. It's more in line with housing economics from the early-to-mid 19th century.

        • mossTechnician a day ago

          In addition to the other comments, I presume the big box retailers do not hire for full-time positions when they don't have to, and gig economy work is rapidly replacing jobs that used to be minimum wage.

        • yndoendo a day ago

          My uncle was running a number of fast food restaurants for a franchise owner making millions. His statement about this topic is simple, "they are not living wage jobs ... go into manufacturing if you want a living wage".

          I don't like my uncle at all and find him and people like him to be terrible human beings.

          • The-Bus a day ago

            If a business can't pay a living wage, it's not really a successful business. I, too, could become fabulously wealthy selling shoes if someone just gave me shoes for $1 so I could resell them for $50.

            • AnthonyMouse a day ago

              > If a business can't pay a living wage, it's not really a successful business.

              Let's consider the implications of this. We take an existing successful business, change absolutely nothing about it, but separately and for unrelated reasons the local population increases and the government prohibits the construction of new housing.

              Now real estate is more scarce and the business has to pay higher rent, so they're making even less than before and there is nothing there for them to increase wages with. Meanwhile the wages they were paying before are now "not a living wage" because housing costs went way up.

              Is it this business who is morally culpable for this result, or the zoning board?

              • array_key_first 14 hours ago

                Success and morality are orthogonal. If you can't make money wherever you're operating your business, then you're not successful.

                • AnthonyMouse 4 hours ago

                  But in that case they are successful; they're just not paying very much relative to the cost of living, as a result of someone else's imposition of artificial scarcity.

              • FireBeyond 20 hours ago

                There are certainly elements of this. And there are also elements like my city, where some of the more notable local business owners and developers are all _way too cozy_ with the City Council and Planning/Zoning Boards (not just rubbing shoulders at community events and fundraisers, but "our families rent Airbnbs and go on vacation together" cozy), which gives them greater influence.

                All that being said, though, Robert Heinlein said once:

                > There has grown up in the minds of certain groups in this country the notion that because a man or corporation has made a profit out of the public for a number of years, the government and the courts are charged with the duty of guaranteeing such profit in the future, even in the face of changing circumstances and contrary to the public interest. This strange doctrine is not supported by statute or common law. Neither individuals nor corporations have any right to come into court and ask that the clock of history be stopped, or turned back.

                • AnthonyMouse 4 hours ago

                  > And there are also elements like my city, where some of the more notable local business owners and developers are all _way too cozy_ with the City Council and Planning/Zoning Boards (not just rubbing shoulders at community events and fundraisers, but at the "our families rent AirBnBs together and go on vacation together" level), which gives them greater influence.

                  But now you're just condemning the zoning board and their cronies as it should be, as opposed to someone else who can't pay higher wages just because real estate got more expensive since it got more expensive for them too.

                  > Neither individuals nor corporations have any right to come into court and ask that the clock of history be stopped, or turned back.

                  Which is basically useless in this context because when costs increase you could apply it equally to not raising the minimum wage (the individual has to suck it up) or raising the minimum wage (the small business owner has to suck it up). Meanwhile neither of them should have to suck it up because we should instead be getting the costs back down.

            • raw_anon_1111 a day ago

              Can we use the same argument for all of the businesses that are only surviving because of VC money?

              I find it rich how many tech people are working for money losing companies, using technology from money losing companies and/or trying to start a money losing company and get funding from a VC.

              Every job is not meant to support a single person living on their own raising a family.

              • dpkirchner a day ago

                That's what VC money is for. When it comes to paying below a living wage, we typically expect the government to provide support to make up the difference (so they're not literally homeless). Businesses that rely on government to pay their employees should not exist.

                • raw_anon_1111 a day ago

                  That’s kind of the point, a mom and pop restaurant or a McDonald’s franchise owner doesn’t have the luxury of burning $10 for every $1 in revenue for years and being backed by VC funding.

                  Oh and the average franchise owner is not getting rich. They are making $100K a year to $150K a year depending on how many franchises they own.

                  Also tech companies can afford to pay a tech worker more money because you don’t have to increase the number of workers when you get more customers.

                  YC is not going to give the aspiring fast food owner $250K to start their business like they are going to give “pets.ai - AI for dog walkers”

                  • dpkirchner 20 hours ago

                    In that case they probably shouldn't be running a McDonald's. They aren't owed that and they shouldn't depend on their workers getting government support just so the owners can "earn" their own living wage.

                    • raw_anon_1111 20 hours ago

                      Yet tech workers are “owed” making money because they are in an industry where their employers “deserve” to survive despite losing money because they can get VC funding - funded by among others government pension plans?

                      I find it slightly hypocritical that people can clutch their pearls at small businesses who risk their own money while yet another BS “AI” company’s founders can play founder using other people’s money.

            • CamperBob2 a day ago

              Classically, not all jobs are considered "living wage" jobs. That whole notion is something some people made up very recently.

              A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances... and if it does, the owner has a strong incentive to automate it away.

              • autoexec a day ago

                > A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances

                The majority of minimum wage workers are adults, not teenagers. This is also true for McDonald's employees. The idea that these jobs are staffed by children working summer jobs is simply not reality.

                Anyone working for someone else, doing literally anything for 40 hours a week, should be entitled to enough compensation to support themselves at a minimum. Any employer offering less than that is either a failed business that should die off and make room for one that's better managed or a corporation that is just using public taxpayer money to subsidize their private labor expenses.

              • kube-system 21 hours ago

                A teenager is presumably also going to school full time and works their job part time, not ~2000 hours per year.

                If we build a society where someone working a full time job is not able to afford to reasonably survive, we are setting ourselves up for a society of crime, poverty, and disease.

              • array_key_first 14 hours ago

                Just the simple fact that mcdonalds is open during school hours is enough to demolish the "teenagers flipping burgers" type arguments.

              • swiftcoder a day ago

                > A teenager in his/her first job at McDonald's doesn't need a "living wage."

                Turns out our supply of underage workers is neither infinite, nor even sufficient to staff all fast food jobs in the nation

              • jfindper a day ago

                >A teenager in his/her first job at McDonald's doesn't need a "living wage."

                Wow, a completely bad-faith argument.

                Can you try again, but this time, try "steelman" instead of "strawman"?

        • zzzeek a day ago

          in that case it should be completely uncontroversial to raise the minimum wage and help that .5% of labor out. yet somehow, it's a non-starter. (btw, googling says the number is more like 1.1%. in 1979, 13.4% of the labor force made minimum wage. this only shows how obsolete the current minimum wage level is).

    • nickjj a day ago

      > The mid 90s was pretty scary too.

      If you fast forward just a few years though, it wasn't too bad.

      You could put together a decent fully parted out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.

    • microtonal a day ago

      That's kinda like saying the mid-20s were pretty scary too, minimum wage was AMOUNT and a MacBook M4 Max was $3000..

      In the mid-90s me and my brother were around 14 and 10, earning nothing but a small amount of monthly pocket money. We were fighting so much over our family PC that we decided to save and put together a machine from second-hand parts we could get our hands on. We built him a 386 DX 40 or 486SX2 50 or something like that and it was fine enough for him to play most DOS games. Heck, you could even run Linux (I know because I ran Linux in 1994 on a 386SX 25, with 5MB RAM and 20MB disk space).

      • kube-system a day ago

        > That's kinda like saying the mid-20s were pretty scary too, minimum wage was AMOUNT and a MacBook M4 Max was $3000..

        A powerbook 5300 was $6500 in 1995, which is $13,853 today.

        • kergonath 21 hours ago

          > A powerbook 5300 was $6500 in 1995

          The TCO was much higher, considering how terrible and flimsy this laptop was. The power plug would break if you looked at it funny and the hinge was stiff and brittle. I know that’s not the point you are making but I am still bitter about that computer.

      • sejje 21 hours ago

        Linux notoriously runs on worse hardware than almost anything, especially in the 90s

    • morsch a day ago

      Are you sure? From what I can tell it's more like 500 USD RRP on release, boxed.

      Either way, it was the 90s: two years later that was a budget CPU because the top end was two to three times the speed.

    • trollbridge a day ago

      In the mid 90s mere mortals ran a 486DX2 or DX4.

      Pentium 60/66s were in the same price tier as expensive alpha or sparc workstations.

  • Barathkanna a day ago

    I agree with you on SSDs, that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.

    • asenna a day ago

      I know it's not the same. But I think a lot of people had a similar feeling going from Intel-Macbooks to Apple Silicon. An insane upgrade that I still can't believe.

      • crazygringo a day ago

        This. My M1 MacBook felt like a similarly shocking upgrade -- probably not quite as much as my first SSD did, but still the only other time when I've thought, "holy sh*t, this is a whole different thing".

      • wongarsu a day ago

        The M1 was great. But the jump felt particularly great because Intel Macbooks had fallen behind in performance per dollar. Great build quality, great trackpad, but if you were after performance they were not exactly the best thing to get

        • skylurk a day ago

          For as long as I can remember, before M1, Macs were always behind in the CPU department. PC's had much better value if you cared about CPU performance.

          After the M1, my casual home laptop started outperforming my top-spec work laptops.

          • kergonath 21 hours ago

            > For as long as I can remember, before M1, Macs were always behind in the CPU department. PC's had much better value if you cared about CPU performance.

            But not if you cared about battery life, because that was the tradeoff Apple was making. Which worked great until about 2015-2016. The parts they were using were not Intel’s priority and it went south basically after Broadwell, IIRC. I also suppose that Apple stopped investing heavily into a dead-end platform while they were working on the M1 generation some time before it was announced.

      • redwall_hp a day ago

        I usually use an M2 Mac at work, and haven't really touched Windows since 2008. Recently I had to get an additional Windows laptop (Lenovo P series) for a project my team is working on, and it is such a piece of shit. It's unfathomable that people are tolerating Windows or Intel (and then still have the gall to talk shit about Macs).

        It's like time travelling back to 2004. Slow, loud fans, random brief freezes of the whole system, a shell that still feels like a toy, a proprietary 170W power supply and mediocre battery life, subpar display. The keyboard is okay, at least. What a joke.

        Meanwhile, my personal M3 Max system can render Da Vinci timelines with complex Fusion compositions in real time and handle whole stacks of VSTs in a DAW. Compared to the Lenovo choking on an IDE.

        • array_key_first 14 hours ago

          A lot of this is just windows sucking major balls. Linux distros with even the heaviest DEs like KDE absolutely fly on mediocre or even low range hardware.

          I got a lunar lake laptop and slapped fedora on it and everything is instant. And I hooked up two 1440p/240Hz displays over Thunderbolt.

        • ponector 21 hours ago

          There wouldn't be such a big difference if you compared laptops in the same price bracket. Cheap PCs are crap.

          • Kirby64 19 hours ago

            > Cheap PCs are crap.

            Expensive PCs are also crap. My work offers Macbooks or Windows laptops (currently, Dell, but formerly Lenovo and/or HP), and these machines are all decidedly not 'cheap' PCs. Often retailing in excess of $2k.

            All my coworkers who own Windows laptops do is bellyache about random issues, poor battery life, and sluggish performance.

            I used to have a Windows PC for work about 3 years ago as well, and it was also a piece of crap. Battery would decide to 'die' at 50% capacity. After replacement, 90 minute battery life off charger. Fan would decide to run constantly if you did anything even moderately intensive such as a Zoom meeting.

      • bigyabai a day ago

        It's a lot more believable if you tried some of the other Wintel machines at the time. Those Macbook chassis were the hottest of the bunch, it's no surprise the Macbook Pro was among the first to be redesigned.

    • simlevesque a day ago

      I've had this with gen5 PCIe SSDs recently. My T710 is so fast it's hard to believe. But you need to have a lot of data to make it worth it.

      Example:

          > time du -sh .
          737G .
          ________________________
          Executed in   24.63 secs
      
      And on my laptop that has a gen3, lower spec NVMe:

          > time du -sh .
          304G .
          ________________________
          Executed in   80.86 secs
      
      
      It's almost 10 times faster. The CPU must have something to do with it too but they're both Ryzen 9.
      • adgjlsfhk1 a day ago

        To me that reads 3x, not "almost 10x". The main difference here is probably power. A desktop/server is happy to send 15W to the SSD and hundreds of watts to the CPU, while a laptop wants the SSD running in the ~1 watt range and the CPU in the 10s of watts range.

        • simlevesque a day ago

          There's over twice as much content in the first test. It's around 3.8 GB/s vs 30 GB/s if you divide each folder size by its du duration. That makes it 7.9 times faster and I'm comfortable calling that "almost 10 times".
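          Spelling that division out (a quick sanity check in Python; treating du's scan rate as data throughput is a simplification, since du mostly stats metadata):

```python
# Throughput-style comparison of the two `du` runs above.
fast = 737 / 24.63    # desktop, gen5 NVMe: ~29.9 "GB"/s scanned
slow = 304 / 80.86    # laptop, gen3 NVMe:  ~3.8 "GB"/s scanned
print(round(fast / slow, 1))  # -> 8.0
```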

          • adgjlsfhk1 a day ago

            oops. I missed the size diff. that's a solid 8x. that's cool!

      • taneliv a day ago

        I believe you, but your benchmark is not very useful. I get this on two 5400rpm 3T HDDs in a mirror:

            $ time du -sh .
            935G    .
                                                                                                                                  
            real    0m1.154s
        
        Simply because there's less than 20 directories and the files are large.
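        A tiny Python sketch of the effect (throwaway temp files; the layout and sizes are made up for illustration):

```python
import os, tempfile

# Same 1 MiB of data laid out two ways, in a throwaway temp dir.
root = tempfile.mkdtemp()
with open(os.path.join(root, "big"), "wb") as f:
    f.write(b"\0" * (1 << 20))                 # one 1 MiB file
many = os.path.join(root, "many"); os.mkdir(many)
for i in range(1024):
    with open(os.path.join(many, str(i)), "wb") as f:
        f.write(b"\0" * 1024)                  # 1 MiB across 1024 files

def du_apparent(path):
    """Walk and stat, roughly what `du -s --apparent-size` does."""
    total = 0
    for dirpath, _, files in os.walk(path):
        for name in files:
            total += os.stat(os.path.join(dirpath, name)).st_size
    return total

# Identical totals, but the many-files tree costs ~1024x the stat() calls,
# and those calls are what a `du` timing actually measures.
print(du_apparent(many) == os.path.getsize(os.path.join(root, "big")))  # -> True
```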
        • simlevesque a day ago

          I should have been more clear: It's my http cache for my crawling jobs. Lots of files in many shapes.

          My new setup: gen5 ssd in desktop:

              > time find . -type f | wc -l
              5645741
              ________________________
              Executed in    4.77 secs
          
          My old setup, gen3 ssd in laptop:

              > time find . -type f | wc -l
              2944648
              ________________________
              Executed in   27.53 secs
          
          Both are running pretty much non-stop, very slowly.
    • pstadler a day ago

      This and high resolution displays, for me at least.

    • jug a day ago

      I thought so too on my mini PC. Then I got my current Mac mini M4 and I have to give it to Apple, or maybe in part to ARM... It was like another SSD moment. It still hasn't spun up the fan and runs lukewarm at most for my office, coding and photo work.

    • wdfx a day ago

      The only time I had this other than changing to SSD was when I got my first multi-core system, a Q6600 (confusingly labeled a Core 2 Quad). Had a great time with that machine.

      • genewitch a day ago

        "Core" was/is like "PowerPC" or "Ryzen", just a name. Intel Core i9, for instance, as opposed to Intel Pentium D: both x86_64, different chip features.

  • prmoustache a day ago

    As others mentioned, there's plenty of refurbished stuff and second-hand parts out there, so there isn't any risk of finding yourself having to buy something at insane prices if your computer were to die today.

    If you don't need a GPU for gaming you can get a decent computer with an i5, 16GB of ram and an NVMe drive for USD 50. I bought one a few weeks ago.

  • forinti a day ago

    You can still get brand new generic motherboards for old CPUs.

    I swapped out old ASUS MBs for an i3-540 and an Athlon II X4 with brand new motherboards.

    They are quite a bit cheaper than a new kit, so I guess that's the market they cater to: people who don't need an upgrade but whose MBs gave in.

    You can get these for US$20-US$30.

  • davely a day ago

    About a month ago, the mobo for my 5950x decided to give up the ghost. I decided to just rebuild the whole thing and update from scratch.

    So I went crazy and bought a 9800X3D, and purchased a ridiculous amount of DDR5 RAM (96GB, which matches my old machine’s DDR4 RAM quantity). At the time, it was about $400 USD or so.

    I’ve been living in blissful ignorance since then. Seeing this post, I decided to check Amazon. The same amount of RAM is currently $1200!!!

    • VHRanger a day ago

      Same, I got 96GB of high end 6000MHz DDR5 this summer for $600CAD and now it's nearly triple at $1500CAD

    • genewitch a day ago

      what are you doing with that old 5950x?

  • mikepurvis a day ago

    For a DDR3-era machine, you'd be buying RAM for that on Ebay, not Newegg.

    I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-m2 expansion card with its own splitter (no motherboard bifurcation required) and that let me get a pretty modern-feeling setup with nice fast cache disks.

  • snickerbockers 21 hours ago

    >For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

    I only ever noticed it on my Windows partition. IIRC on my Linux partition it was hardly noticeable, because Linux is far better at caching disk contents than Windows. Linux can also boot surprisingly fast even on HDDs if you only install the modules you actually need, so autoconfiguration doesn't waste time probing dozens of modules in search of the best one.

    • dotancohen 10 hours ago

      How do you determine which modules are actually needed on a Debian system?

      • snickerbockers an hour ago

        IDK; at the time i was using gentoo, in which it's natural not to have more modules than necessary because part of the installation process involves generating your own kernel configuration.

        Even though it's not the normal way to install Debian, there ought to be some sort of way to build your own custom kernels and modules without interference from the package manager (or you can just run it all manually and hope that you don't end up in a conflict with apt). Gentoo is the only distro where it's mandatory but I'm pretty sure this is supported on just about every distribution, since it would be necessary for maintainers.

      • kasabali 6 hours ago

        Set MODULES=dep in initramfs.conf (https://manpages.debian.org/jessie/initramfs-tools/initramfs...) and run update-initramfs -u.

        I doubt that'll affect boot time but it reduces initrd.img size a lot.
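        Concretely, something like this should do it on a stock Debian install (a sketch; the config path and MODULES variable are per the initramfs-tools manpage, and it's worth keeping a backup of the old initrd):

```shell
# Switch initramfs-tools from bundling "most" modules to only those the
# running hardware needs, then rebuild the initrd for the current kernel.
sudo sed -i.bak 's/^MODULES=most/MODULES=dep/' /etc/initramfs-tools/initramfs.conf
sudo update-initramfs -u
ls -lh /boot/initrd.img-"$(uname -r)"   # the new image should be much smaller
```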

        • snickerbockers an hour ago

          maybe not on an SSD but it definitely helps a lot on HDD by virtue of having far less disk traffic. The kernel's method for figuring out which modules to load is effectively to load every single module that might be compatible with a given device in series and then ask the module for its opinion before unloading it, and then once it has a list of all (self-reported) compatible modules for a given device it picks one and reloads it.

        • dotancohen 2 hours ago

          Thank you. I'll spend some time to learn why that is not the default config setting.

  • aposm a day ago

    A few years later but similarly - I am still running a machine built spur-of-the-moment in a single trip to Micro Center for about $500 in late 2019 (little did we know what was coming in a few months!). I made one small upgrade in probably ~2022 to a Ryzen 5800X w/ 64GB of RAM, but it's otherwise untouched. It still flies through basically anything & does everything I need, but I'm dreading when any of the major parts go and I have to fork out double or triple the original cost for replacements...

  • acters a day ago

    I am still running an i5 4690k, really all I need is better GPU but those prices are criminal. I wish I got a 4090 when I had the chance rip

    • genewitch a day ago

      intel arc b580 (i think that's the latest one) isn't obnoxiously priced but you're going to have to face the fact that your PCIE is really very slow. But it should work.

      if you want to save even more money get the older Arc Battlemage GPUs. I used one it was comparable with an RTX 3060; i returned it because the machine i was running it in had a bug that was fixed 2 days before i returned it but i didn't know that.

      I was seriously considering getting a b580 or waiting until the b*70 came out with more memory, although at this point i doubt it will be very affordable considering VRAM prices going up as well. A friend is supposedly going to ship me a few GTX 1080ti cards so i can delay buying newer cards for a bit.

      • TheAmazingRace 19 hours ago

        By older Arc, I presume you're referring to Alchemist and not Battlemage in this case.

        One of my brothers has a PC I built for him, specced out with an Intel Core i5 13400f CPU and an Intel Arc A770 GPU, and it still works great for his needs in 2025.

        Surely, Battlemage is more efficient and more compatible in some ways over Alchemist. But if you keep your expectations in check, it will do just fine in many scenarios. Just avoid any games using Unreal Engine 5.

        • genewitch 18 hours ago

          yeah i had an A770; it should be ~$200-$250 now on ebay, lightly used. It's, in my opinion, worth about $200, if it's relatively unused. As i mentioned, it's ~= RTX 3060 at least for compute loads, and the 16GB is nice to have for that. But for a computer from the 4th gen i'd probably only get a A380 or A580; the A380 is $60-$120 on ebay.

      • interloxia 20 hours ago

        Note that some tinkering may be required for modern cards on old systems.

        - A UEFI DXE driver to enable Resizable BAR on systems which don't support it officially. This provides performance benefits and is even required for Intel Arc GPUs to function optimally.

        List of working motherboards

        https://github.com/xCuri0/ReBarUEFI/issues/11

        • genewitch 19 hours ago

          you need to enable rebar even for gaming? i had to enable rebar for pytorch usage (the oneAPI requires it iirc).

  • phantasmish a day ago

    I’m worried about the Valve mini PC coming out next year.

    Instant buy $700 or under. Probably buy up to $850. At, like, $1,100, though… solid no. And I’m counting on that thing to take the power-hog giant older Windows PC tower so bulky it’s unplugged and in a closet half the time, out of my house.

  • KronisLV a day ago

    If I needed a budget build, I'd probably look in the direction of used parts on AliExpress, you can sometimes find good deals on AM4 CPUs (that platform had a lot of longevity, even now my main PC has a Ryzen 7 5800X) and for whatever reason RX 580 GPUs were really, really widespread (though typically the 2048SP units). Not amazing by any means, but a significant upgrade from your current setup and if you don't get particularly unlucky, it might last for years with no issues.

    Ofc there's also the alternate strategy of going for a mid/high end rig and hoping it lasts a decade, but the current DDR5 prices make me depressed so yeah maybe not.

    I genuinely hope that at some point the market will get flooded with good components with a lot of longevity and reasonable prices again in the next gens: like AM4 CPUs, like that RX 580, or GTX 1080 Ti but I fear that Nvidia has learnt their lesson in releasing stuff that pushes you in the direction of incremental upgrades rather than making something really good for the time, same with Intel's LGA1851 being basically dead on arrival, after the reviews started rolling in (who knows, maybe at least mobos and Core Ultra chips will eventually be cheap as old stock). On the other hand, at least something like the Arc B580 GPUs were a step in the right direction - competent and not horribly overpriced (at least when it came to MSRP, unfortunately the merchants were scumbags and often ignored it).

  • the__alchemist a day ago

    Man, it was just GPU for a while. But same boat. I regret not getting the 4090 for $1600 direct from Nvidia. "That's too much for a video card", and got the 4080 instead. I dread the day when I need to replace it.

    • jakogut a day ago

      The Radeon RX 9070 XT performs at a similar level to the RTX 5070, and is retailing around $600 right now.

      • adrift 2 hours ago

        Unfortunately, AMD drivers are beyond terrible and you'll experience frequent timeouts.

      • the__alchemist a day ago

        No CUDA means not an option for me.

        • the__alchemist a day ago

          > What kinds of applications do you use that require CUDA?

          Molecular dynamics simulations, and related structural bio tasks.

          • vlovich123 a day ago

            Is the CUDA compat layer AMD has, which transparently compiles existing CUDA just fine, insufficient or buggy somehow? Or are you just stuck in the mindshare game and haven't reevaluated whether the AMD situation has changed this year?

            • the__alchemist 21 hours ago

              I haven't checked out AMD's compatibility layer and know nothing about it. I tried to get vkFFT working in addition to cuFFT for a specific computation, but can't get it working right; crickets on the GH issue I posted.

              I use Vulkan for graphics, but Vulkan compute is a mess.

              I'm not stuck in mindshare, and this isn't a political thing. I am just trying to get the job done, and have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.

              • vlovich123 20 hours ago

                I didn’t talk about Vulkan compute.

                > have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.

                I’m saying this is a mindshare thing if you haven’t evaluated ROCm / HIP. HIPify can convert CUDA source to HIP automatically, and HIP’s syntax is very similar to CUDA’s.

            • jakogut 20 hours ago

              There's also ZLUDA, which can run llama.cpp and some other CUDA workloads already without any modification, but it's still maturing.

        • jakogut a day ago

          What kinds of applications do you use that require CUDA?

  • hnu0847 a day ago

    Don't all RAM manufacturers offer a lifetime warranty?

    That said, if the shortage gets bad enough then maybe they could find themselves in a situation where they were unable/unwilling to honor warranty claims?

  • PHGamer 17 hours ago

    You should upgrade to a used or new 12900K with DDR4, since it's still cheaper than DDR5 even if it is up. Then get a used 3080 Ti with 12GB. You'll be able to do proper H.265 decode/encode with that for up to 4K (unfortunately not 8K, but hey, no one really has that yet).

  • square_usual a day ago

    You can still buy DDR4 for pretty cheap, and if you're replacing a computer that old any system built around DDR4 will still be a massive jump in performance.

  • ls612 a day ago

    GPU prices are actually at MSRP now for most cards other than the 5090.

    • bilegeek 20 hours ago

      Problem is MSRP is also inflated, and Covid has locked that in. Arc Battlemage is the only exception I see.

      • ls612 19 hours ago

        You’ve never been able to buy more GPU performance per dollar than you can today.

        • kcb 19 hours ago

          That's not very encouraging because that statement has been true most every day in computing for the past 50 years. The rate at which the compute per dollar increases is what matters.

          • ls612 18 hours ago

            Yeah, Moore's law is slowing down for sure. I’m just pushing back on the whole sky-is-falling doomerism in the PC community. I will admit I’m lucky that my current system with 64GB of memory and a 4090 is likely to be good for years to come, so I can wait out the RAM shortage.

  • nipperkinfeet 13 hours ago

    I'm glad I kept my ageing Dell Studio XPS 7100. I need to stock up on some Dell Precision towers from the surplus in case my Studio XPS 7100 breaks. This AI bubble needs to burst soon.

  • adventured a day ago

    You could still easily build a $800-$900 system that would dramatically jump forward from that machine.

    $700 in 2014 is now $971 inflation adjusted (BLS calculator).

    RTX 3060 12gb $180 (eBay). Sub $200 CPU (~5-7 times faster than yours). 16gb DDR4 $100-$120. $90 PSU. $100 motherboard. WD Black 1tb SSD $120. Roughly $800 (which inflation adjusted beats your $700).
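    Summing that list (taking the top of the ranges given):

```python
# Parts list from above, upper end of each quoted price (USD).
parts = {"RTX 3060 12GB": 180, "CPU": 200, "16GB DDR4": 120,
         "PSU": 90, "motherboard": 100, "1TB SSD": 120}
print(sum(parts.values()))  # -> 810
```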

    Right now is a rather amazing time for CPUs, even though RAM prices have gone crazy.

    Assume you find some deals somewhere in there, you could do slightly better with either pricing or components.

  • rasz 14 hours ago

    Do yourself a favor and order a $25 Xeon E3-1231 v3/E3-1241 v3 from China. Those work in ordinary LGA1150 desktop motherboards. Used DDR3 ram is also so cheap you can bump to 32GB for another $20.

    easy cheap upgrades. CPU will be noticeable straight away, ram only if you are running out.

  • TacticalCoder a day ago

    > For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

    The last one where I really remember seeing a huge speed bump was going from a regular SSD to an NVMe M.2 PCIe SSD... Around 2015 I bought one of the very first consumer motherboards with an NVMe M.2 slot and put a Samsung 950 Pro in it: that was quite something (though I was upgrading the entire machine, not just the SSD, so there's that too). Before that, I don't remember when I switched from SATA HDD to SATA SSD.

    I'm now running one of those WD SN850X Black NVMe SSD but my good old trusty, now ten years old, Samsung 950 Pro is still kicking (in the wife's PC). There's likely even better out there and they're easy to find: they're still reasonably priced.

    As for my 2015 Core i7-6700K: it's happily running Proxmox and Docker (but not always on).

    Even consumer parts are exceptionally reliable: the only failures I remember, in 15 years (and I've got lots of machines running), are a desktop PSU (replaced by a Be Quiet! one), a no-name NVMe SSD and a laptop's battery.

    Oh and my MacBook Air M1's screen died overnight for no reason after precisely 13 months, when I had a warranty of 12 months, (some refer to it as the "bendgate") but that's because first gen MacBook Air M1 were indescribable pieces of fragile shit. I think Apple got their act together and came up with better screens in later models.

    Don't worry too much: PCs are quite reliable things. And used parts for your PC from 2014 wouldn't be expensive on eBay anyway. You're not forced to upgrade to a last gen PC with DDR5 (atm 3x overpriced) and a 5090 GPU.

    • dontlaugh 19 hours ago

      It’s motherboards that tend to fail the most. I had one fail this year, albeit for the first time.

    • genewitch a day ago

      fyi someone or something is downvoting your recent posts to oblivion, and i didn't see any obvious reason.

  • testing22321 a day ago

    I got a used M1 MacBook Air a year ago.

    By far the fastest computer I’ve ever used. It felt like the SSD leap of years earlier.

  • sneak a day ago

    Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.

    I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?

    Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.

    Spend at least 1% of your gross revenue on your tools used to make that revenue.

    • Clent a day ago

      This is a crazy out of touch perspective.

      Depending on salary, two orders of magnitude on $5k is $500k.

      That amount of money for the vast majority of humans across the planet is unfathomable.

      No one is worried about if the top 5% can afford DRAM. Literally zero people.

      • sneak a day ago

        The vast majority of humans across the planet aren’t making their money with their computer, which was the qualifier in the first line of my comment.

        Furthermore, even if they did, the vast majority of them still won’t be using their computer to generate revenue - they’ll be using an employer-provided one and the things I’m talking about have nothing to do with them.

    • macNchz a day ago

      What is the actual return on that investment, though? This is self-indulgence justified as « investment ». I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950X, 64GB RAM, Radeon 6900XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was a game-changing upgrade from an aging laptop for my purposes of running multiple VMs and a complex dev environment, but I really don't know what I would have gotten out of replacing it every year since. It's still blazing fast.

      Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.

      • crazygringo a day ago

        > This is self indulgence justified as « investment ».

        Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.

        It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?

      • londons_explore a day ago

        Every hardware update for me involves hours or sometimes days of faffing with drivers and config and working round new bugs.

        Nobody is paying for that time.

        And whilst it is 'training', my training time is better spent elsewhere than battling with why CUDA won't work on my GPU upgrade.

        Therefore, I avoid hardware and software changes merely because a tiny bit more speed isn't worth the hours I'll put in.

      • mikepurvis a day ago

        My main workstation is similar, basically a top-end AM4 build. I recently bumped from a 6600 XT to a 9070 XT to get more frames in Arc Raiders, but looking at what the cost would be to go to the current-gen platform (AM5 mobo + CPU + DDR5 RAM) I find myself having very little appetite for that upgrade.

      • sneak 6 hours ago

        The logistics of upgrading a Mac are:

        1) rsync home directory over to new machine

        2) generate new SE keys in secretive

        3) push new authorized_keys out to all servers and test (scripted)

        4) start using new machine

        5) wipe old machine

        It takes a few hours and most of it is waiting on 10GE rsync which only goes at like 3000Mbit and you can still use the source machine while it runs.
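
        Steps 1 and 3 above can be sketched as shell commands. (The hostname, key filename and server list are illustrative assumptions, not the parent's actual setup.)

```shell
# Step 1: copy the home directory to the new machine over the LAN.
# -a preserves permissions and times, -H hard links, -X extended attributes.
rsync -aHX --progress ~/ newmac.local:~/

# Step 3: push the new public key to every server, then test login (scripted).
while read -r host; do
  ssh-copy-id -i ~/.ssh/id_new.pub "$host"
  ssh -i ~/.ssh/id_new -o BatchMode=yes "$host" true && echo "$host ok"
done < ~/servers.txt
```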

    • jfindper a day ago

      >I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year

      >I buy the best phones and desktops money can buy

      Sick man! Awesome, you spend 1/3 of the median US salary on a laptop and desktop every year. That's super fucking cool! Love that for you.

      Anyways, please go brag somewhere else. You're rich, you shouldn't need extra validation from an online forum.

    • ceejayoz a day ago

      > Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?

      Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.

      Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.

    • mitthrowaway2 a day ago

      Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.

      Prices are high but they're not that high, unless you're buying the really big GPUs.

      • sgerenser a day ago

        Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.

        • mitthrowaway2 a day ago

          You have to configure it with the lowest-spec SSD and then replace that with an aftermarket 4 TB SSD at around $215. The P14s I bought last week, with that and the 8 GB Nvidia GPU, came to a total of USD $2,150 after taxes, including the SSD. Their sale price today is not quite as good as it was last week, but it's still in that ballpark: with the 255H CPU, the iGPU and a decent screen, the Intel P14s goes for $2,086 USD, which drops to $1,976 because $110 comes off at checkout. Throw in the aftermarket SSD and it's around $2,190. And if you log in as a business customer you'll get another couple percent off as well.

          The AMD P14s, with 96 GB, an upgraded CPU, the nice screen and Linux, still goes for under $1,600 at checkout, which becomes about $1,815 when you add the aftermarket SSD upgrade.

          It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.

        • lionkor a day ago

          Typing this on a similar-spec P16s that was around 2.6k or so. So if you call anything under 3k simply 2k, then it was 2k.

          That's in Germany, from a corporate supplier.

    • gr4vityWall a day ago

      > maybe $250/month (...) which you can then use to go and earn 100x that.

      25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they're still affected by rising PC part prices.

      I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

      • Dibby053 a day ago

        >Most developers in the third world don't make that in a full year

        And many in the first world haha

      • londons_explore a day ago

        > But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

        The soon-to-be-unemployed SV engineers, once LLMs mean anyone can design an app and backend with no coding knowledge.

        • genewitch a day ago

          and you can code from an rpi / cellphone and use a cloud computer to run it so you actually don't really need an expensive PC at all

    • vultour a day ago

      Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?

      • sneak a day ago

        Most tech jobs could be done on a $25 ten year old smartphone with a cracked screen and bulging battery.

        That’s exactly my point. Underspending on your tools is a misallocation of resources.

        • pqtyw a day ago

          That's a bizarrely extreme position. For almost everyone, a ~$2,000-3,000 PC from several years ago is indistinguishable from one they can buy now from a productivity standpoint. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...

        • jermaustin1 a day ago

          Overspending on your tools is also a misallocation of resources. An annual $22k spend on computing is around a 10-20x overspend even for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.

          • sneak 6 hours ago

            It’s not $22k/year, as the hardware still has great resale value when it’s replaced in 14-18 months.

            It’s less than $8-10k/year when all is said and done.

            I pay more for my car+insurance.

            • jcalvinowens 5 hours ago

              Look at it a different way: if you'd invested that $10K/year you've been blowing on hardware, how much more money would you have today? How about that $800/month car payment too?

        • antiframe a day ago

          Yes, you don't want to underspend on your tools to the point where you suffer. But I think you are missing the flip side: I can do my work comfortably with 32GB of RAM, and while my 1%-a-year budget could get me more, why not pocket the difference?

          The goal is the right tool for the job, not the best tool you can afford.

    • dragonwriter 6 hours ago

      > Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?

      Yes, you are crazy for saying that.

      > but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.

      There is a very wide range of incomes of people using their computers, and, more to the point, $5k/yr on hardware is way past the point where, for most people using their computer for income, additional hardware expenditure has any benefit to income generation.

      > Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.

      Most people using their computer to earn income do not earn anywhere close to $25,000/mo ($300k/yr), and hardware expenditures aren't the limiting factor holding them back.

      Also, the minimum of $5k/yr you suggested is not $250/mo, but more than 1.5× that at $417/mo.

      > Spend at least 1% of your gross revenue on your tools used to make that revenue.

      Median annual wage for a US software developer is ~$140k per most recent BLS numbers, and that's one of the higher-paying fields of work that people use computers for. Neither your original $5k per year nor even the $3k/year suggested by your later $250/mo suggestion are warranted by your 1% on tools rule for most people earning income with their computer, especially on hardware alone, as that is far from all of the "tools" that are relevant to most computer work.

    • kube-system a day ago

      I agree with the general sentiment - that you shouldn't pinch pennies on tools that you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.

    • dghlsakjg a day ago

      If you are consistently maxing out your computers performance in a way that is limiting your ability to earn money at a rate greater than the cost of upgrades, and you can't offload that work to the cloud, then I guess it might make sense.

      If, like every developer I have ever met, your constraint is your own time, motivation and skills, then spending $22k per year is a pretty interesting waste of resources.

      Does it make sense to buy good tools for your job? Yes. Does it make sense to buy the most expensive version of a tool when you already own last year's most expensive version? Rarely.

    • ChromaticPanic a day ago

      That's crazy spend for anyone making sub 100K

      • jermaustin1 a day ago

        It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D workloads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.

        A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and with only 32GB of RAM.

        • sneak 6 hours ago

          No, it’s just a maxed out mac studio with 512gb unified ram. Nothing dedicated or specialty.

          Honestly a huge chunk of it is the Apple internal SSD tax but who wants to wait for usb3 external i/o?

      • red-iron-pine a day ago

        thats a crazy spend for sub-200k or even sub-500k

        you're just building a gaming rig with a flimsy work-related justification.

        • sneak 6 hours ago

          I have a different computer for games and rarely have time to play them.

    • hansvm a day ago

      Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.

      Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.

      Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc) derived from that expenditure exceeds $5k/yr you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays without substantial improvements year-to-year. Contrasting buying one of those every 2yrs vs your proposal, you'd need a $4k/yr net expenditure to pay off somehow, somehow making use of the incremental CPU/RAM/etc to achieve that value. If it doesn't pay off then it's just a toy you're buying for personal enjoyment, not something that you should nebulously tie to revenue generation potential with an arbitrary 1% rule. Still maybe buy it, but be honest about the reason.

      So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.

      Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.

      • hansvm 19 hours ago

        It's too late to edit, but I do have one more thought on the topic.

        From the perspective of an individual, ROI has to be large to justify a $5k/yr investment. HOWEVER, the general principle of "if something is your livelihood, then you should be willing to invest in it as appropriate" is an excellent thing to keep in mind. Moreover, at the scale of a company and typical company decisions, the advice makes a ton of sense: if a $1k monitor and a $2k laptop let your employees context-switch better, you should almost certainly invest in that hardware. In contrast with the employee's view of ROI, those investments are tax-deductible and just have to pay off in absolute value, and they don't have the delay and interaction with wages/promotions/etc. that introduce uncertainty and loss into the calculation. The difference between a few hundred dollars and a few thousand dollars in total capital investment probably makes a huge difference in outcomes for a lot of computer-based employee roles.

    • neogodless a day ago

      Have you ever heard of the term "efficiency"?

      It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.

      With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.

      That and your employer should pay for your hardware in many cases.

    • nickjj a day ago

      I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.

      ======== COMPUTER ========

      I feel no pain yet.

      Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.

      My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
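
      As a rough illustration, a stack like that is typically wired together with a Compose file along these lines (the service names, images and commands are hypothetical, not the parent's actual config):

```yaml
# Hypothetical sketch of a web app + background worker + Postgres + Redis stack.
services:
  web:
    build: .
    ports: ["8000:8000"]          # app server
    depends_on: [postgres, redis]
  worker:
    build: .
    command: ["python", "worker.py"]  # background job runner
    depends_on: [redis]
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
  redis:
    image: redis:7
```

      With everything declared once, `docker compose up` brings the whole stack up in seconds, which is consistent with the startup times described above.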

      Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.

      I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.

      I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.

      ======== PHONE ========

      I had a Pixel 4a until Google busted the battery. It runs all of the apps (no games) I care about and Google Maps is fast. The camera was great.

      I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day-to-day it makes no difference from the 4a, literally none. It even has the same storage, of which I have around 50% left with around 4,500 photos saved locally.

      ======== ASIDE ========

      I have a pretty decked out M4 MBP laptop issued by my employer for work. I use it every day and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster, that's the impact of a $2,500+ upgrade for general web usage.

      I'm really sensitive to skips, hitches and performance related things. For real, as long as you have a decent machine with an SSD using a computer feels really good, even for development workloads where you're not constantly compiling something.

    • Krssst a day ago

      One concern I'd have: if the short-term supply of RAM is fixed anyway, then even if all daily computer users increased their budgets to match the new pricing, demand would exceed supply again and prices would just rise in response, until they got unreasonable enough that demand dropped back down to supply.

    • crote 21 hours ago

      Sorry, but that's delusional.

      For starters, hardware doesn't innovate quickly enough to buy a new generation every year. There was a 2-year gap between Ryzen 7000 and Ryzen 9000, for example, and a 3-year gap between Ryzen 5000 and Ryzen 7000. On top of that, most of the parts can be reused, so you're at best dropping in a new CPU and some new RAM sticks.

      Second, the performance improvement just isn't there. Sure, there's a 10% performance increase in benchmarks, but that does not translate to a 10% productivity improvement for software development. Even a 1% increase is unlikely, as very few tasks are compute-bound for any significant amount of time.

      You can only get to $15k by doing something stupid like buying a Threadripper, or putting an RTX 4090 into it. There are genuine use-cases for that kind of hardware - but it isn't in software development. It's like buying a Ferrari to do groceries: at a certain point you've got to admit that you're just doing it to show off your wealth.

      You do you, but in all honesty you'd probably get a better result spending that money on a butler to bring your coffee to your desk instead of wasting time by walking to the coffee machine.

    • iberator 21 hours ago

      Extremist point of view, and NOT optimal. Diminishing performance per $...

      The proper calculation is the cost/performance ratio. Then buy the second one on the list :)

    • ambicapter a day ago

      I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.

    • imtringued 6 hours ago

      Malagasy data annotators work for like $100 a month. You're pretty crazy to suggest that they should spend more on the hardware than they earn from it.

    • kotaKat a day ago

      I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)

      I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.

      Big Tech has made it unaffordable for everyone.

      • zozbot234 a day ago

        8GB or 16GB of RAM is absolutely a usable machine for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course you need to run something other than Windows or macOS. It's only very niche use cases such as media production or running local LLM's that will absolutely require more RAM.
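
        On Linux, the compressed swap mentioned above is commonly set up with a zram device; a minimal sketch (the zstd choice, 8G size and swap priority are illustrative, and distro packages like zram-tools can automate this):

```shell
# Create a compressed RAM-backed swap device (zram).
sudo modprobe zram
echo zstd | sudo tee /sys/block/zram0/comp_algorithm  # pick the compressor
echo 8G   | sudo tee /sys/block/zram0/disksize        # uncompressed capacity
sudo mkswap /dev/zram0
sudo swapon -p 100 /dev/zram0   # higher priority than any disk swap
swapon --show                   # verify the device is active
```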

        • pqtyw a day ago

          > something other than Windows or macOS

          > 8GB

          No modern IDE either, nor a modern Linux desktop environment (they are not that much more memory efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?

      • imtringued 5 hours ago

        I assume those aren't US dollars? My suggestion is to go on a classifieds site and find a bargain there. You can find 2x8GB SODIMM DDR4 for like 20€ in Germany, because it's the default configuration for laptops and people are buying aftermarket RAM to upgrade to 2x16GB leaving a glut in 2x8GB configurations. Something similar happened to the desktop DIMMs but to a lesser extent because you can put four of them into a PC.

      • ecshafer a day ago

        The bright side is that the bust is going to leave a glut of cheap used parts.

      • sneak a day ago

        [flagged]

        • kotaKat a day ago

          Oh. I’m not allowed to own a home computer to try to further my own learning and education and knowledge then.

          Guess I’ll go fuck myself now then.

          • jfindper a day ago

            They're just using this comment section to brag about how well off they are, I wouldn't worry too much. They're completely out of touch.

            • bombcar a day ago

              It's the "how much can the banana cost, $10?" of HN.

              The point they're trying to make is a valid one - a company should be willing to spend "some money" if it saves time of the employee they're paying.

              The problem is usually that the "IT budget" sits in a separate portion/group of the company than the "salary" budget, and the "solution" can be to force a certain dollar amount to be spent each year (with one year of carry-forward, perhaps) so that employees always have access to good equipment.

              (Some companies are so bad at this that a senior engineer of 10+ years will have a ten year old PoS computer, and a new intern will get a brand new M5 MacBook.)

jsheard a day ago

To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or Mediatek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.

  • lkramer a day ago

    To be honest, this actually sounds kinda healthy.

    • dgemm a day ago

      It's a forcing function that ensures the middle layers of a vertically integrated stack remain market competitive and don't stagnate because they are the default/only option

    • _aavaa_ a day ago

      Sears would like to have a word about how healthy intra-company competition is.

      • marcosdumay a day ago

        Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.

        It makes absolutely no sense to apply the lessons from one into the other.

        • StableAlkyne a day ago

          I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.

          Each of these units was then given access to an internal "market" and directed to compete with the others for funding.

          The idea was likely to try and improve efficiency... But what ended up happening is siloing increased, BUs started infighting for a dwindling set of resources (beyond normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.

          It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.

          • silisili a day ago

            This happened at a place where I worked years ago, but not as "on purpose." We were a large company where most pieces depended on other pieces, and everything was fine - until a new CEO came in who started holding the numbers of each BU under a microscope. This led to each department trying to bill other departments as an enterprise customer, who then retaliated, which then led to internal departments threatening to go to competitors who charged less for the same service. Kinda stupid how that all works - on paper it would have made a few departments look better if they used a bottom-barrel competitor, but in reality the company as a whole would have bled millions of dollars...all because one rather large BU wanted to goose its numbers.

            • raw_anon_1111 2 hours ago

              Why is that a bad thing? If an internal department that’s not core to their business is less efficient than an external company - use the external company.

              Anecdote: Even before Amazon officially killed Chime, everyone at least on the AWS side was moving to officially supported Slack.

          • red-iron-pine a day ago

            to put a finer point on it, it wasn't just competition or rewarding-the-successful, the CEO straight up set them at odds with each other and told them directly to battle it out.

            basically "coffee is for closers... and if you don't sell you're fired" as a large scale corporate policy.

          • _aavaa_ a day ago

            Yes, this is what I was referring to. I should have provided more context, thanks for doing so.

          • marcosdumay a day ago

            That was a bullshit separation of a single horizontal cut of the market (all of those segments did consumer retail sales) without overlap.

            The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of TV gameshow competition where candidates do worthless tasks, judged by some arbitrary rules.

            That has absolutely no similarity to how Samsung is organized.

        • reaperducer a day ago

          Sears had horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different vertical with lots of redundant components.

          Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.

          • marcosdumay a day ago

            Ok. And if it did divide on the borders of insurance and payment services, the reorganization wouldn't have been complete bullshit and may even have been somewhat successful.

    • itsastrawman a day ago

      The opposite, nepotism, is very unhealthy, so I think you're correct.

      • hammock a day ago

        Not sure that the opposite of transfer pricing is nepotism. As far as I know, it's far more common for someone who owns a lake house to assign four weeks a year to each grandkid than to make them bid real money on it and put that into a maintenance fund or something. Though it's an interesting idea, it's not very family-friendly.

    • zoeysmithe a day ago

      n/a

      • crazygringo a day ago

        I genuinely can't tell if this is sarcasm? Or do you live somewhere where this is taught?

    • fransje26 a day ago

      Yeah, makes absolute sense.

      A bit like Toyota putting a GM engine in their car because the Toyota engine division is too self-centered, focusing too much on efficiency.

      • cobalt60 a day ago

        You mean Toyota putting a BMW engine in the Supra. Your statement is contradictory, as Toyota has TRD, which focuses on track performance. They just couldn't match the straight-six's performance and reliability with their own 2JZ.

        • Sohcahtoa82 16 hours ago

          > toyota putting bmw engine (supra).

          Or Toyota using a Subaru engine (Scion FRS, Toyota GT86)

        • Der_Einzige a day ago

          Buying a Supra is stupid. Either buy a proper BMW with the B58/ZF 8-speed and get a proper interior, or stop being poor and buy an LC500.

          Better yet, get a C8 Corvette and gap all of the above for far better value. You can get 20% off MSRP on factory orders of C8 Corvettes if you know where to look.

  • EMM_386 a day ago

    Isn't this how South Korean chaebols work?

    They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.

  • DavidPeiffer a day ago

    I worked with some supply chain consultants who mentioned "internal suppliers are often worse suppliers than external".

    Their point was that service levels are often not as stringently tracked, SLA's become internal money shuffling, but the company as a whole paid the price in lower output/profit. The internal partner being the default allows an amount of complacency, and if you shopped around for a comparable level of service to what's being provided, you can often find it for a better price.

  • Terr_ 15 hours ago

    I think this is a good time to reference a comic showing software companies' various "Org Charts", especially the one for Microsoft.

    https://goomics.net/62

  • morcus a day ago

    > two versions of the same phone with different processors

    That's hilarious, which phone is this?

    • petcat a day ago

      Basically every Galaxy phone comes in two versions: one with Exynos and one with Snapdragon. It's regional, though. The US always gets the Snapdragon phones while Europe and most of Asia get the Exynos version.

      My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.

      • sgerenser a day ago

        In the past, using Snapdragon chips for the U.S. made sense due to Qualcomm having much better support for the CDMA network used by Verizon. Probably no longer relevant since the 5G transition, though.

    • muvlon a day ago

      Not one phone, they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.

      • numpad0 a day ago

        The "Galaxy S II" name and its aesthetics were already a mere branding shared across at least four different phones with different SoCs, before counting sub-variants that share the same SoC. This isn't unique to Samsung, nor is it a new phenomenon; it's just how consumer products are made and sold.

        1: https://en.wikipedia.org/wiki/Samsung_Galaxy_S_II

      • noisem4ker a day ago

        The S23 too was Snapdragon only, allegedly to let the Exynos team catch some breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon brother. A bit worse on photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.

    • grincek a day ago

      This was the case as recently as the S24: phones can come with Exynos or Snapdragon, with Exynos usually featuring worse performance and battery life.

    • intrikate a day ago

      I might be out of date, but last I knew, it was "most of them."

      International models tended to use Samsung's Exynos processors, while the ones for the North American market used Snapdragons or whatever.

    • namibj a day ago

      Several high end Galaxy S's AFAIK.

  • MagicMoonlight a day ago

    That’s really good business. Everyone is pushing to be the best rather than accepting mediocrity.

OhMeadhbh 19 hours ago

Or we could go back to using software that didn't require 1 GB to run the OS / browser combo, just so the browser can load "too much" JavaScript to enable a webmail interface.

In the 80s I ran an early SMTP / POP email client that functioned as a DOS TSR with code and data under 64k. Granted, it was pretty crappy and was text-only (non-MIME). But there's got to be a middle ground between 64k for a craptastic text-only email client and a 1 GB OS / browser / webmail combo that could probably run that DOS TSR in an emulator as an attachment to a short email.

  • knallfrosch 18 hours ago

    You have to ship software that behaves exactly the same across three desktop operating systems and two mobile operating systems. Be glad we leave out tablets for this. You're able to hire 4 vibecoding juniors and your deadline is in 6 months.

    You know what you choose for a frontend? It's Electron + React and React Native.

    And none of your customers will complain because the people with money to spend are rocking 12GB of RAM on their phone.

    • jbverschoor 17 hours ago

      The Electron apps are the same size as a complete VM of Windows 2000.

    • OhMeadhbh 17 hours ago

      Yeah. That's not the world I live in. I live in the terminal.

      • joks 15 hours ago

        Congrats on being part of the 0.001% I guess?

  • benced 15 hours ago

    People optimize for what is scarce. Previously, developer time was much more scarce than memory. Seems like that might change, so folks might start to optimize for memory. It's not a moral judgement; it's a technical decision about where to allocate resources.

  • Aloisius 16 hours ago

    As long as javascript developers don't care about memory usage, I'm not sure what anyone can do.

    I mean, for goodness sake, an empty YouTube page with no videos eats up a shocking amount of memory - 90 MB just for the js heap. I used to run Windows 3.1 on a machine with 8 MB of RAM.

    Admittedly, a good amount of memory used with browsers is because of larger graphics buffers that go along with higher resolution monitors, but much is just... waste.

  • cyanydeez 18 hours ago

    Yo, your comment makes it seem like AI deserves this memory and chrome does not...

    • estimator7292 18 hours ago

      They're saying the amount of bloat in modern software is so ridiculous that it requires multiple gigabytes of memory to run a single application that, in a sane universe, shouldn't occupy more than a hundred MB.

      AI doesn't deserve it more than we do, but also we shouldn't be required to have $300 in RAM for basic functionality. We shouldn't have to deal with RAM scalpers because businesses don't want to develop good software.

      Instead, we the users are forced to pay for more and more memory and CPU and disk because some rich asshole doesn't want to spend the money on developing good software. The costs are pushed to us. And since resources are now unimaginably expensive, it's still our problem and we still have to foot the bill a million times over.

      • ponector 16 hours ago

        The thing is: most people don't want to pay for good software. For any software.

        And on top of that bug fixes and efficiency gains are never a top priority, only new features and redesign are pushed forward.

        • cyanydeez 15 hours ago

          Explains AI perfectly.

          Bizarre how some commenters have rose-colored glasses, as if AI isn't exponential bloat.

      • cyanydeez 16 hours ago

        Yo, are you saying AI is not the epitome of bloat? Like, it's not a cancer of epic proportions. Very confused..

        Wat?

        • hug 14 hours ago

          LLMs use a lot of RAM as a fundamental part of their operation. The RAM is used to achieve the goal as efficiently as we know how. Even if you disagree with the goal needing to be achieved at all, the RAM usage is about as efficient as we can design.

          Regular modern applications use a lot of RAM as an incidental or accidental part of their operation. Even if you think the tasks that they're achieving are of extreme need, the RAM use is excessive.

          These problems are apples and oranges. You can hate both, or one, or neither. I know plenty of people who are in each one of those camps.

          • cyanydeez 8 hours ago

            Chrome fundamentally uses RAM to avoid... what?

            What a weird LLM apologetics.

            • hug 7 hours ago

              If you don’t think Chrome could be way more RAM efficient, and especially if you don’t think the things running inside Chrome could be more efficient, I have a bridge to sell you.

              If you think acknowledging that fact (and the fact that there’s really not a great way to make LLMs more efficient) is “apologetics”, I cannot engage with you in good faith.

              • cyanydeez 4 hours ago

                Ok, so LLMs can't be more efficient... wat

    • OhMeadhbh 17 hours ago

      And I for one welcome our new AI overlords!

      Seriously though... I think most uses of LLMs are pretty stupid, but it seems like we're in the bubble and the only way people can continue to make money is by doubling down on AI spending. Or at least that's the only way they think they can make money.

      So... sorry for leaving you with that impression. Maybe the only way to get to the post AI hype world is to give AI companies everything they want so they fail faster.

      "Yes, the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders!" ( see https://economicsociology.org/2014/10/07/yes-the-planet-got-... )

      Does anyone deserve RAM though?

      • cyanydeez 16 hours ago

        I was just pointing out the weirdness of AI bloat and complaint about application bloat....it's not irony...it's like, the zeitgeist.

        • OhMeadhbh 16 hours ago

          lol. yes. I am behind the times complaining about app bloat. I should be complaining about LLM bloat. But I have enough bile in me to complain about both.

          • cyanydeez an hour ago

            well, one of them just demands you get another 32GB stick of RAM. The other wants you to pay for its high-voltage power lines, give it millions of gallons of water, and provide zero benefits to the local community.

            If bloat was given human form, it's definitely LLMs and their corporate financiers.

khannn a day ago

"The price of eggs has nothing on the price of computer memory right now." A dozen eggs went to ~$5. They are eggs, and most people use what, max 12 eggs a month? Get out of here with that trite garbage. Everyone knew the egg shortage was due to the extreme step the US takes of culling flocks infected with avian flu, and that it was transitory.

  • 542458 a day ago

    Surprisingly, Americans apparently average 279 eggs per year per person, or about 23 per month.

    https://www.washingtonpost.com/business/2019/02/28/why-ameri...

    (This is not a comment making any judgements about cost or the state of the economy, I was just surprised to find it that high)

    • red-iron-pine a day ago

      cuz eggs are in breakfast sandwiches, are ingredients in pastries, act as binders in things like meatloaf or fried chicken, etc. etc.

    • silisili a day ago

      That sounded high to me as well (probably because I rarely eat eggs), but then I remembered my parents, who each eat two per day, which isn't that uncommon I guess.

    • baud147258 a day ago

      Maybe if you include all the eggs in processed food like cookies or cakes and in restaurants or other catering operations you reach that number? And eggs consumed at home could still be around 12 per person?

  • xboxnolifes a day ago

    The average person buys, what, 0 RAM per month? Who cares.

    • khannn a day ago

      The average person buys a phone amortized at 36 months minus trade-in value. So they do indeed buy ram every month but it's a line item on a phone bill.

      • xboxnolifes 20 hours ago

        Assuming an 8GB phone on average and 2x16GB DDR5 desktop sticks being ~$400, the average person then buys 0.25GB RAM per month at $3.125.

        If you want, you can add in a 16GB laptop every 36 months, tripling the total to 0.75GB and ~$10 a month. Still, that's multiple times less than the increase in egg price compared to the average consumption.
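          Redone with exact division (a rough sketch using the figures above; the 0.25 GB and $3.125 numbers round the phone's 8 GB over 36 months up slightly):

          ```python
          def monthly_ram_cost(gb, months, usd_per_gb):
              """Amortized RAM bought per month (GB) and its cost (USD)."""
              gb_per_month = gb / months
              return gb_per_month, gb_per_month * usd_per_gb

          # Assumed figure from the comment: $400 for 2x16GB DDR5 -> $12.50/GB
          usd_per_gb = 400 / 32

          phone = monthly_ram_cost(8, 36, usd_per_gb)    # 8GB phone, 36-month cycle
          laptop = monthly_ram_cost(16, 36, usd_per_gb)  # optional 16GB laptop

          print(f"phone:   {phone[0]:.2f} GB/mo, ${phone[1]:.2f}/mo")
          print(f"+laptop: {phone[0] + laptop[0]:.2f} GB/mo, ${phone[1] + laptop[1]:.2f}/mo")
          # phone:   0.22 GB/mo, $2.78/mo
          # +laptop: 0.67 GB/mo, $8.33/mo
          ```

          Either way, it comes out to single-digit dollars a month, well under the egg-price swing per the argument above.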

        • khannn 19 hours ago

          Apples and oranges comparison. RAM works forever while eggs only keep someone full for 4 to 6 hours. I'd honestly like to see the amount of time someone is full from eating eggs vs. the average daily screen time vs. the cost of both; let's say the service life of the phone is 36 months, with the cost of the eggs averaged out over that three-year period.

  • n8cpdx a day ago

    Eggs have traditionally been an extremely cheap protein staple.

    A typical pattern might be to have two eggs for breakfast (a whopping 120 calories), boiled eggs for lunch/snack (another 60-120 calories), and of course baking, but I will pretend that people don’t bake.

    A more typical serving for an adult breakfast might be 3 eggs if not supplemented.

    For mom and dad and the little one, you’re now at 35 (2+2+1+2)x5 eggs per week. When your cost goes from $6 (2x18 @3) to $16 (2x18@8) per week, you notice.

    Obviously the political discourse around this was not healthy. But eggs suddenly becoming a cost you have to notice is a big deal, and a symbol for all of the other grocery prices that went up simultaneously.

    If you’re a typical HN user in the US you might be out of touch with the reality that costs going up $10/week can be a real hardship when you’re raising a family on limited income.

    The peak was actually closer to $8/dozen, my math has been conservative at every step, the situation is worse than I describe.
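    Spelled out, the weekly arithmetic above (using the per-18-count carton prices, not the per-dozen peak) looks like:

    ```python
    # Hypothetical household from the comment: two adults and one kid, 5 days/week
    eggs_per_day = 2 + 2 + 1 + 2       # two breakfasts, the kid, and a boiled-egg snack
    eggs_per_week = eggs_per_day * 5   # 35 eggs; call it two 18-count cartons

    for price_per_carton in (3.00, 8.00):  # assumed pre- and post-spike 18-count price
        weekly = 2 * price_per_carton
        print(f"${weekly:.2f}/week at ${price_per_carton:.2f} per 18-count")
    # $6.00/week at $3.00 per 18-count
    # $16.00/week at $8.00 per 18-count

    extra_per_week = 2 * (8.00 - 3.00)  # the $10/week swing the comment describes
    ```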

    • khannn a day ago

      Parents in the US don't feed their kids eggs for breakfast, it's majority cereal or breakfast bars. Maybe some yogurt but that's almost always upper middle class or above.

      "If you’re a typical HN user in the US you might be out of touch with the reality that costs going up $10/week can be a real hardship when you’re raising a family on limited income.".

      Skill issue. Oatmeal is very cheap and filling. The aforementioned yogurt. Nothing, yeah nothing, because the average person is obese here and nothing is exactly what they need for breakfast. A piece of fruit like the perennial classic banana for breakfast. Complaining about egg prices comes from the camp of "I tried nothing and nothing worked".

      • hombre_fatal a day ago

        I agree, but for some reason there's huge mental inertia to the foods we eat day to day.

        Paying more for staples that you've eaten your whole life (especially in a boiled-frog way) is much cheaper in time, energy, and mental effort than experimenting with how you and your kids might like a bowl of oatmeal prepared.

        That said, if you're having trouble making ends meet and you have kids, you don't have much of a choice.

      • et-al 20 hours ago

        Aside from yoghurt, you’ve only listed carbs. Sure oatmeal has protein (and fiber), but not as much as eggs.

      • bobsmooth 20 hours ago

        The quintessential out of touch HN comment.

        • khannn 19 hours ago

          I have friends with kids, have siblings with kids, and indeed did grow up in the US. Ate cereal growing up with maybe some eggs on the weekend. My siblings feed their kids exactly what I described. My friends feed their kids the same. I have no idea how that is out of touch, but I grew up lower-middle class and that's my lived experience.

          • n8cpdx 18 hours ago

            I grew up similarly but regularly had eggs for breakfast, at least some of the time. Usually on toast. When eggs are cheap, that is competitive with cereal or pop tarts.

            I would have hoped that better access to nutrition information would have led to parents making better choices. Absolutely insane that they’re still choosing desserts for breakfast every day instead of high-quality whole foods like eggs.

          • bobsmooth 19 hours ago

            It's more how you said it than what you said.

            • khannn 17 hours ago

              Please, please, comment on one account and I'm sorry I hurt your fee fees

      • stuffn 19 hours ago

        > Complaining about egg prices comes from the camp of "I tried nothing and nothing worked".

        Eggs are one of the highest protein-per-calorie, nutrient dense foods you can purchase. Up until recently it was cheaper than almost any other staple. When I was growing up (admittedly during a time everything was relatively cheap) my family ate a lot of eggs. We had spreads, we had eggs for breakfast, and eggs were incorporated into dinners in one way or another. I'm not the only one. I don't know anyone born in my cohort that didn't eat eggs regularly.

        > Oatmeal is very cheap and filling

        Also completely devoid of the same level of nutrition as eggs and requires supplementation.

        > it's majority cereal or breakfast bars.

        While true this is an education issue not a cost issue. We still have at least 3 generations of people having children that were raised in the "eggs are horrible for you" times, including myself.

        > Nothing, yeah nothing, because the average person is obese here and nothing is exactly what they need for breakfast.

        The average person is obese because of the relative ease of cheap, high calorie, fillers and good options being more expensive. The price of eggs increasing compounds this. However, I would wager most adults are obese because of the high calorie starbucks, fast food, and snacks. Not because of cereal for breakfast.

        > A piece of fruit like the perennial classic banana for breakfast.

        Demonstrably worse for you than both cereal and eggs. Once again, defeating your point and STILL demonstrating more expensive eggs makes nutritionally worse options the only option.

  • th0ma5 a day ago

    There was also a lot of profiteering going on? This was talked about quite a bit? And it's still going on in other markets with other things like cars??

    • khannn a day ago

      "Profiteering"? Truth is... the game was rigged from the start

    • venturecruelty a day ago

      Sorry, we have to starve so the two dairy distributors can have another good quarter. I hear gruel is cheap, for now.

rafaelmn a day ago

Apple is going to be even more profitable in the consumer space because of RAM prices ? I feel like they are the only player to have the supply chain locked down enough to not get caught off guard, have good prices locked in enough in advance and suppliers not willing to antagonize such a big customer by backing out of a deal.

  • londons_explore a day ago

    Apple software typically seems to give a better user experience in less RAM in both desktop and mobile.

    For the last 10+ years Apple's iPhones have shipped with about half the RAM of a flagship Android, for example.

    • Miraste a day ago

      They used to, but they've caught up. The flagship iPhone 17 has 12GB RAM, the same as the Galaxy S25. Only the most expensive Z Fold has more, with 16GB.

      RAM pricing segmentation makes Apple a lot of money, but I think they scared themselves when AI took off and they had millions of 4GB and 8GB products out in the world. The Mac minimum RAM specs have gone up too, they're trying to get out of the hole they dug.

    • fennecbutt a day ago

      People always make this argument. But could you please expand on what you think is actually in memory?

      Code vs. data: by and large, I bet the content held in RAM takes up the majority of space. And people have complained about the lack of RAM in iPhones for ages now, particularly with how it affects browsers.

  • Dibby053 a day ago

    > the only player to have the supply chain locked down enough to not get caught off guard

    What?

    • Miraste a day ago

      Tim Cook is the Supply Chain Guy. He has been for decades, before he ever worked at Apple. He does everything he can to make sure that Apple directly controls as much of the supply chain as possible, and uses the full extent of their influence to get favorable long-term deals on what they don't make themselves.

      In the past this has resulted in stuff like Samsung Display sending their best displays to Apple instead of Samsung Mobile.

ralferoo 19 hours ago

I'm not surprised this is happening in a massive group of companies. I once worked for a smallish company with 400 staff, with basically one product, but different teams - backend, frontend, database, QA, sales, etc... If you were ever speaking to someone from a different team for more than 15 minutes, it had to be put into your weekly timesheet so your time could be cross-billed to the other department.

Obviously the net effect was to silo all the different departments so that nobody really knew how the entire product worked, except the few smokers who'd regularly go outside and smoke for 15+ minutes and chat to whoever else was around.

browningstreet 19 hours ago

Apple has the opportunity to do something really funny and radically increase the base RAM configurations of all their unified memory/CPU/GPU chips. Intel/AMD builders would struggle to meet the price/capacity points.

  • koolala 19 hours ago

    Don't they have to pay the same crazy prices? What gives them the opportunity?

    • dontlaugh 19 hours ago

      They already overprice their memory, so they have a lot more headroom.

Shank a day ago

This is going to be a serious problem. We’ve had smart devices percolate through all consumer electronics, from washing machines to fridges. That’s all fine and dandy but they all need RAM. At what point does this become a national security issue? People need these things and they all require RAM and now assumably will cost more as the raw chip cost increases significantly or the supply chains dry up for lower quantities all together.

fennecbutt a day ago

Well well well. From an anti monopoly standpoint isn't it interesting that each business is doing what it should for its own best interests rather than special deals because they're under the same umbrella?

  • venturecruelty a day ago

    I love how I've seen a bunch of responses that amount to "do you really need that much RAM anyway?" Unreal.

monster_truck a day ago

Based on my time working for Samsung this does not surprise me. The silos within fight against one another more than they ever bother to compete with anyone else

rigrassm 12 hours ago

I am so glad about the choice I made when I upgraded my AM4 system a few years back to a 5800X (the 3D model had just been released, I believe, and it was questionable at the time whether it was worth it outside of gaming). I wanted to build a PC from my hand-me-downs for my kid, so I went ahead and upgraded my storage and RAM too, to a decent 64GB DDR4 set that ended up with a decent stable overclock.

Prices weren't great when I bought it at the time but compared to now, I'm glad I bit the bullet when I did.

time4tea a day ago

Dec 2023:

96GB (2x48) DDR5 5x00: £260, today £1050

128GB (4x32) DDR5 5x00: £350, today £1500

Wut?

Edit: formatting
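Working out the multiples from the price pairs quoted above:

```python
# Price pairs quoted above (Dec 2023 -> today), in GBP
kits = {
    "96GB (2x48) DDR5": (260, 1050),
    "128GB (4x32) DDR5": (350, 1500),
}

for name, (then, now) in kits.items():
    factor = now / then
    print(f"{name}: {factor:.1f}x ({(factor - 1) * 100:.0f}% increase)")
# 96GB (2x48) DDR5: 4.0x (304% increase)
# 128GB (4x32) DDR5: 4.3x (329% increase)
```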

  • tehlike 4 hours ago

    Just earlier this year I paid $35 per 64GB LRDIMM stick ($420 total for 12).

    Now each stick is over $180.

  • dehrmann a day ago

    ECC memory has been one of my better investments of the past two years, and not just because of the crashes it might have prevented.

  • bpye a day ago

    Kind of wish I went for 2x48GB last year, not 2x32GB. Oh well.

aceazzameen a day ago

I really wanted to build a new PC this year, which is obviously not happening anymore. But I do have 2x16GB DDR5 SODIMMs from my laptop that I'm not using, after I upgraded to 64GB a while back. Now I wonder if I can build a tiny PC around those? Does anyone make motherboards that support DDR5 laptop memory?

  • max-leo a day ago

    Minisforum offers a Mini-ITX board with a 16-core Zen4 AMD CPU soldered on for under $400. The AM5 socket version of that same CPU alone is over $500. It uses SO-DIMM DDR5, so it might be an interesting option in your case. (Yes, it is a mobile CPU, but it has the same amounts of L2/L3 cache as the AM5 chip, just clocked 300MHz slower.)

    https://store.minisforum.com/products/minisforum-motherboard

  • ineedasername a day ago

    A bunch of the NUC models use laptop RAM, and often have barebones kits. Looks like ASUS has a decent range of kits and prebuilts, but you may be able to find bare boards. If you want something expandable, look for the "Pro" and "Extreme" range. I had one of the first gaming-oriented NUCs a while back, Hades Canyon; highly capable.

  • ThatPlayer 20 hours ago

    There are adapters that convert the laptop memory for desktop motherboards. So that's an option too.

tippa123 a day ago

This is to be expected from any large corporation. In my experience, this sort of infighting leads to low morale and wastes a significant amount of energy that could be directed somewhere far more productive.

Barathkanna a day ago

When RAM gets so expensive that even Samsung won’t buy Samsung from Samsung, you know the market has officially entered comic mode. At this rate their next quarterly report is just going to be one division sending the other an IOU.

  • Ericson2314 a day ago

    Overleverage/debt and refusing to sell at a certain price are actually very different things, though. OpenAI might be a tire fire, but Samsung is the gold-pan seller here, and presumably has an excellent balance sheet.

qwertox a day ago

I had planned to build a new workstation this fall; all the parts were on the list. But seeing the RAM go from €300 (96 GB) to €820, in stock for €999, in under a month made me decide that I will continue using that laptop from 2019 for maybe another 1.5 years.

It's a ridiculous situation and these companies, whoever they are, should be somewhat ashamed of themselves for the situation they're putting us in.

That goes especially for those MFs at OpenAI who apparently grabbed 40% of the worldwide DRAM production, as well as what's sold in stores.

awongh a day ago

This seems to be for chips put in phones in 2026? I thought these orders were booked further in advance, or is that only for processors?

me551ah a day ago

It is absolutely the worst time to be a gamer. First it was GPU prices that went up as NVIDIA focused more and more on their enterprise cards, and now it's RAM prices. I don’t think I’ve ever seen the price of computer components go up so much.

itopaloglu83 a day ago

The manufacturers are willing to quadruple prices for the foreseeable future but won't change their manufacturing quotas a bit.

So much for open markets; somebody must check their books and manufacturing schedules.

  • dgacmu a day ago

    In their defense, how many $20 billion fabs do you want to build in response to the AI ... (revolution|bubble|other words)? It seems very, very difficult to predict how long DRAM demand will remain this elevated.

    It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)

    • itopaloglu83 a day ago

      I don’t think they’re working at 100% capacity, or that they don't have any other fab they could utilize for other low-profit stuff.

      Let’s check their books and manufacturing schedule to see if they’re artificially constraining the supply to jack up the prices on purpose.

      • dgacmu a day ago

        I'd take the opposite bet on this. They're diverting wafer capacity from lower-profit items to things like HBM, but all indications are that wafer starts are up a bit. Just not up enough.

        For example: https://chipsandwafers.substack.com/p/mainstream-recovery

        "Sequentially, DRAM revenue increased 15% with bit shipments increasing over 20% and prices decreasing in the low single-digit percentage range, primarily due to a higher consumer-oriented revenue mix"

        (from june of this year).

        The problem is that the DRAM market is pretty tight - supply or demand shocks tend to produce big swings. And right now we're seeing both an expected supply shock (transition to new processes/products) as well as a very sudden demand shock.

        • itopaloglu83 19 hours ago

          So it’s also the perfect time to constrain the product flow to jack up the prices.

          They’ve been acting like a cartel for a long time now, and somehow they never match demand, even after 18 straight months of price increases. They already have the fabs, the procedures, and everything, so stop acting like they'd be setting up a brand new fab just to increase throughput.

          • dgacmu 16 hours ago

            This seems like a weird subject on which to be so aggressive, or at least I'm interpreting your tone that way. DRAM manufacturers absolutely have engaged in illegal price fixing in the past (1998-2002 in particular). But they've also overbuilt and underbuilt in fairly regular cycles, resulting in large swings in dram price and profitability. And they've had natural disasters reduce production capacity (e.g., micron in 2021). But there's no evidence right now that this is anything except finding themselves in the nice (but nervous) position of making a product that just just had a major demand spike, combined with some clever contract work by openai.

            Demand right now is so high that they'd make more net profit if they could make more dram. They could still be charging insane prices. They're literally shutting down consumer sales - that's completely lost profit.

      • fullstop a day ago

        > I don’t think they’re working at 100% capacity or don’t have any other FAB that they can utilize for other low profit stuff.

        I have a family member who works in a field related to memory and storage fabrication. At the moment Micron, etc, are running these money printers full time and forgoing routine maintenance to keep the money flowing.

        • itopaloglu83 19 hours ago

          What I said stands: let’s check their books and manufacturing schedules to see if they’re artificially constraining the supply.

          The fact that they’re busy doesn’t hide the fact that they’ve been known to collude before, and they might even ship parts to phony resellers to keep prices high.

          What’s next? A commodity memory chip is going to cost more than a cpu or gpu die?

  • davey48016 a day ago

    Most of the things people say about efficient markets assume low barriers to entry. When it takes years and tens of billions of dollars to add capacity, it makes more sense to sit back and enjoy the margins. Especially if you think there's a non-trivial possibility that the AI build out is a bubble.

    • itopaloglu83 19 hours ago

      We only need the memory manufacturers to not collude with each other, not even external pressure.

      You want to tame their cartel like behaviors? Just get into their books and it would be clear as day if they’re artificially constraining the supply, and I’m not even talking about spending extra billions.

      You cannot manufacture something that modern life depends on and not get government scrutiny.

  • filloooo a day ago

    Memory chips have always been a very cyclical business, that's why their stock prices remain relatively low despite a windfall happening.

    • itopaloglu83 19 hours ago

      Commodity hardware that's been on a price decline for decades just quadruples in price, and nobody makes any form of long-term investment, or even contracts, to take advantage of the situation? It’s more likely to be collusion than not.

  • arijun a day ago

    If it’s an AI bubble, it would be stupid to open new manufacturing capacity right now. Spend years and billions spinning up a new fab, only to have the bottom of the market drop out as soon as it comes online.

    • itopaloglu83 19 hours ago

      Assuming that opening a new fab is the only way to match the demand is simply asinine.

      You can ramp up production in limited capacity, make long-term contracts, or pass the manufacturing cost to the buyer. When we needed a vertical stabilizer for a legacy aircraft, we paid for an entire production line to be built just to manufacture two tails, so there are tons of ways to do this if you want to be competitive. But instead this is a cartel-like market where manufacturers have colluded before, so they’re more likely to collude than spend billions doing anything.

      Just open their books and schedules with a competent auditors and see if they’re artificially manipulating things or not.

potato3732842 a day ago

You make more money selling the good stuff. It's like this in just about every industry.

  • venturecruelty 21 hours ago

    Why bother selling to regular consumers at all then? One or two big companies can have everything, and the rest of us can have nothing. And we will like it.

Night_Thastus a day ago

I am sooooooooooooooooooooooo glad I bought a 6000 MHz 2x16 kit before all this nonsense started.

I'll be honest, I have 0 confidence that this is a transient event. Once the AI hype cools off, Nvidia will just come up with something else that suddenly needs all their highest end products. Tech companies will all hype it up, and suddenly hardware will be expensive again.

The hardware manufacturers and chip designers have gotten a taste of inflated prices and they are NOT going to let it go. Do not expect a 'return to normal'

Even if demand goes back to exactly what it was, expect prices to remain >30% higher than before for no reason, or as they would call it, 'market conditions'.

blindriver a day ago

I bought 64 GB DDR4 RAM for $189 in 2022. The exact same memory is now almost $600 on Amazon. How can this not impact PC sales and the sale of other electronics?

  • layer8 a day ago

    It will. Manufacturers who didn’t get good supply contracts in time might be forced to leave the market.

DustinBrett a day ago

Ironically, that site was eating up my RAM. PCWorld has some issues, in both Chrome and Firefox.

meindnoch a day ago

I bought 2x16GB of Samsung ECC RAM last week for $150.

rwyinuse a day ago

I'm now glad I bought 128GB of DDR4 when building a new dual purpose server-gaming PC two years ago. The RAM is now worth way more than the rest of the parts combined.

I wonder how this will impact phone prices.

SanjayMehta a day ago

In the 90s, Motorola Mobile used Cypress SRAMs and not Motorola SRAMs.

Pricing.

dreamcompiler a day ago

Once the AI bubble pops there will be smoking deals on RAM (and everything else).

dcchambers 19 hours ago

Seriously concerned for the future of consumer electronics right now.

Next up: Nvidia exits the consumer hardware space and shifts fully to datacenter chips.

DocTomoe a day ago

I feel we have a RAM price surge every four years. The excuses change, but it's always when we see a switch to the next generation of DDR. Which makes me believe it's not AI, or graphics cards, or crypto, or gaming, or one of the billion other conceivable reasons, but price-gouging when new standards emerge and production capacity is still limited. Which would be much harder to justify than 'the AI/crypto/gaming folks (who no one likes) are sweeping the market...'

  • muvlon a day ago

    But we're not currently switching to a next gen of DDR. DDR5 has been around for several years, DDR6 won't be here before 2027. We're right in the middle of DDR5's life cycle.

    That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.

  • JKCalhoun a day ago

    Regardless of whether it is Crypto/AI/etc., this would seem to be wake-up call #2. We're finding the strangle-points in our "economy"—will we do anything about it? A single fab in Phoenix would seem inadequate?

    • jacquesm a day ago

      If 'the West' would be half as smart as they claim to be there would be many more fabs in friendly territory. Stick a couple in Australia and NZ too for good measure, it is just too critical of a resource now.

      • jack_tripper 20 hours ago

        The west is only smart at financial engineering (printing money to inflate stocks and housing). Anything related to non-military manufacturing should be outsourced to the cheapest bidder to increase shareholder value.

    • fullstop a day ago

      Micron is bringing up one in Boise, Idaho as well.

    • baiwl a day ago

      What will we do with that fab in two years when nobody needs that excess RAM?

      • jacquesm a day ago

        There has never been 'an excess of RAM', the market has always absorbed what was available.

        • jack_tripper 20 hours ago

          Yeah right, tell that to Qimonda.

      • Ericson2314 a day ago

        Sell it at lower prices. Demand is a function of price, not a scalar.

        • h2zizzle a day ago

          Tax write-off donations to schools and non-profits, too.

      • JKCalhoun a day ago

        I suspect there will be a shortage of something else then…

        And regardless, you could flip it around and ask, what will we do in x years when the next shortage comes along and we have no fabs? (And that shortage of course could well be an imposed one from an unfriendly nation.)

    • xzjis a day ago

      It's a political problem: do we, the people, have a choice in what gets prioritized? I think it's clear that the majority of people don't give a damn about minor improvements in AI and would rather have a better computer, smartphone, or something else for their daily lives than fuel the follies of OpenAI and its competitors. At worst, they can build more fabs simultaneously to have the necessary production for AI within a few years, but reallocating it right now is detrimental and nobody wants that, except for a few members of the crazy elite like Sam Altman or Elon Musk.

  • jacquesm a day ago

    Why is this downvoted, this is not the first time I've heard that opinion expressed and every time it happens there is more evidence that maybe there is something to it. I've been following the DRAM market since the 4164 was the hot new thing and it cost - not kidding - $300 for 8 of these which would give you all of 64K RAM. Over the years I've seen the price surge multiple times and usually there was some kind of hard to verify reason attached to it. From flooded factories to problems with new nodes and a whole slew of other issues.

    RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of, say, gasoline has been remarkably steady, with one notable outlier with a very easy-to-verify and direct cause.

    • sharpshadow a day ago

      There is also the side effect of limiting people to run powerful models themselves. Could very well be part of a strategy.

      • blurbleblurble 10 hours ago

        It's absolutely part of the strategy and the strategy has multiple prongs. Another prong is this obnoxious push for regulatory capture in the name of "safety".

maxglute a day ago

Kdrama on this when?

venturecruelty a day ago

It's unfortunate that we will soon not have computers because it is not profitable enough. Alas. Too bad the market is so efficient.

  • dmix 21 hours ago

    I'd much rather be in a country where the odd temporary shortage happens due to a massive new market appearing than one where supply/demand is always fixed and static because nothing new gets built without extreme careful planning.

    AI is not going away, but there will be a correction and things will plateau to a new higher level of demand for chips and go back to normal as always. There's too much money involved for this not to scale up.

    Markets can't adapt overnight to tons of data centers being built all of a sudden, but they will adapt.

shevy-java a day ago

AI companies must compensate us for this outrage.

A few hours ago I looked at the RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not: the local price here is now 2.5 times what it was back in 2023 or so, give or take.

I want my money back, OpenAI!

  • h2zizzle a day ago

    This is important to point out. All the talk about AI companies underpricing is mistaken. The costs to consumers have just been externalized; the AI venture as a whole is so large that it simply distorts other markets in order to keep its economic reality intact. See also: the people whose electric bills have jumped due to increased demand from data centers.

    I think we're going to regret this.

    • amarcheschi a day ago

      Americans are subsidizing AI by paying more for their electricity so the rest of the world can use ChatGPT (I'm not counting the data centers of Chinese models and a few European ones, though).

  • Uvix a day ago

    DDR4 manufacturing is being spun down due to lack of demand. The prices on it would be going up regardless of what's happening with DDR5.

  • Forgeties79 a day ago

    I am so glad I built my PC back in April. My 2x16GB DDR5 sticks cost $105 all in then; now it's $480 on Amazon. That is ridiculous!

    • basscomm a day ago

      I'm also glad I overbought RAM when I did my last PC upgrade in January, because who knows when I'll be able to do that again.

      The 96GB kit I bought (which was more than I needed) was $165. When I saw the price go up to $180 in June, I bought another 96GB kit to max out my machine, even though I didn't really need it, because I was concerned about where prices were going.

      That same kit was $600 a month ago, and is $930 today. The entire rest of the computer didn't cost that much.

      • Forgeties79 a day ago

        Yeah, I do regret not going 64GB when it was so cheap, but honestly? 32 has been fine. I had already pushed the budget to future-proof critical things (mobo, PSU, CPU, etc.), and RAM will hopefully drop to sane prices again one day. I doubt I'll feel the strain for 3-5 years, if at all. It's mainly a gaming rig right now.

  • toss1 a day ago

    Yup.

    And even more outrageous is the power grid upgrades they are demanding.

    If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.

    Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).

    • moregrist a day ago

      Usually the narrative for externalizing these kinds of costs is that the investment will result in lots of jobs in the upgrade area.

      Sometimes that materializes.

      Here the narrative is almost the opposite: pay for our expensive infrastructure and we’ll take all your jobs.

      It’s a bit mind boggling. One wonders how many friends our SV AI barons will have at the end of the day.

    • fullstop a day ago

      I bought 2x16 (32GB) DDR4 in June for $50. It is now ~$150.

      I'm kicking myself for not buying the mini PC that I was looking at over the summer. The cost nearly doubled from what it was then.

      My state keeps trying to add data centers in residential areas, but the public seems to be very against it. It will succeed somewhere, and I'm sure there will be a fee on my electric bill for "modernization" or some other bullshit.

  • bell-cot a day ago

    The problem is further upstream. Capitalism is nice in theory, but...

    "The trouble with capitalism is capitalists; they're too damn greedy." - Herbert Hoover, U.S. President, 1929-1933

    And the past half-century has seen both enormous reductions in the regulations enacted in Hoover's era (when out-of-control financial markets and capitalism resulted in the https://en.wikipedia.org/wiki/Great_Depression), and the growth of a class of grimly narcissistic/sociopathic techno-billionaires - who control way too many resources, and seem to share some techno-dystopian fever dream that the first one of them to grasp the https://en.wikipedia.org/wiki/Artificial_general_intelligenc... trophy will somehow become the God-Emperor of Earth.

    • nyeah 20 hours ago

      It'll be fine.