• Excrubulent@slrpnk.net
    1 year ago

    So you’ve vaguely waved your hands in the direction of innovations that you think are different now than in the 1970s, without explaining how they’re different or where those innovations came from.

    You aren’t actually pointing to any serious innovations Silicon Valley has done.

    Modern device development consists of more than gluing a bunch of APIs together, but it largely does consist of that.

    Apple maintains those things not for innovation purposes, but to keep a walled garden. If they maintain Objective-C, iOS and macOS on their own terms, they can keep people locked into their ecosystem, overcharge them for devices, overcharge again for repairs, and use both to upsell people into the next model. They are notorious for this shitty behaviour. It’s not real innovation.

    And when you say wireless is straight-up black magic, what you mean is that it’s a real technology developed by researchers, not capitalists. Real R&D is expensive, so capitalism socialises the costs and privatises the rewards.

    • AlotOfReading@lemmy.world
      1 year ago

      I haven’t explained what the differences are because almost everything is different. It’s like comparing a Model T to a Bugatti. They’re simply not built the same way, even if they both use internal combustion engines and gearboxes.

      Let me give you an overview of how the research pipeline works, though. First is fundamental research, which outside of semiconductors is usually funded by public sources. This encompasses things like methods of crack formation in glasses, better solid-state models, improved error-correction algorithms and so on.

      The next layer up is applied research, where the fundamental research is applied to improve or optimize existing solutions, or to create new partial solutions to unsolved problems. Funding here is a mix of private and public depending on the specific area. Semiconductor companies do a lot of their own original research here as well, as you can see from these Micron and TSMC memory research pages. It’s very common for publicly funded researchers here to take that research and use it to start a private company, usually with funding from their institution. This is where many important semiconductor companies have their roots, including TSMC via ITRI.

      These companies in turn invest in product / highly applied research aimed at productizing the research for the mass market. Sometimes this is easy, sometimes it’s extremely difficult. Most of the challenges of EUV lithography occurred here, because going from low-yield academic research to high-yield commercial feasibility was extremely difficult. Direct investment here is almost always private, though there can be significant public investment through companies. If this work is published (it often isn’t), it’s commonly published as patents. Every company you’ve heard of has thousands of these patents, and some of the larger ones have tens or hundreds of thousands. All of that is the result of internal research.

      Lastly, they’ll take all of that, build standards (e.g. DDR5, H.265, 5G), and develop commercial implementations that actually do those things. That’s what OEMs buy (or try to develop on their own, in the case of Apple modems) to integrate into their products.

      • Excrubulent@slrpnk.net
        1 year ago

        So you’ve admitted that all primary research is done in the public sector, because of course it is. Exploratory primary research is not guaranteed to be profitable. That leaves only what is relatively guaranteed to be profitable for private industry. And what is guaranteed to be profitable is… not exactly very innovative. By definition it has to be fairly obvious.

        So the other things you’ve mentioned are all implementations: optimisations towards local minima/maxima. Mass production is a more or less solved problem; figuring out how to deal with the specific problems that crop up in individual production pipelines is work, certainly, but it’s not innovation. This is boilerplate stuff. Every factory since the dawn of factories has done it. And even to get there, you had to dig deep into relatively unknown Silicon Valley companies that most people have never heard of. They are not Apple, not even close.

        And the semiconductor industry is somewhat unique in that it’s in the middle of a half-century-long slide towards a distant local minimum, specifically the size of its transistors. That means companies can keep making iterative changes that they know will pay off. We can tell they know, because multiple companies are developing this technology in parallel and more or less keeping pace with one another. If they were coming up with true innovations, that wouldn’t happen.

        The number of patents tells you nothing about what is actually being developed. I will admit I was perhaps too zealous in saying they “don’t develop”, because obviously the things you mentioned are a form of development; it’s just that they are predictable advancements, which means anybody with the funding to hire engineers could pull them off. They are advancements in the same way a train advances along a track. They are low-hanging fruit. These companies didn’t till the soil, plant the trees or nurture them. They just came along at harvest season, plucked the apples off the branches and called themselves farmers.

        If you call that sort of thing “innovation”, I would say you have a very low bar for that term.

        Also, it seems to me that you, like the other person, threw out a lot of technical-sounding details all at once that certainly seemed to drive towards a point. Perhaps you hoped I couldn’t discern the difference between what you were saying and real innovation (you did say I had “no idea”), or perhaps you actually think they’re the same thing. In any case, I understand enough about this process to see through it, but someone less technically informed wouldn’t. I’m not terribly impressed by an attempt to bury me in the weeds like that.

        If you had a real slam-dunk example of innovation, I imagine you would have pointed to it. You can’t, though, because with DRAM, for example, future iterations of the same technology aren’t fundamentally any different: they operate on the same basic principles developed in the public sector, and I think you know that.