Also known as @VeeSilverball

  • 3 Posts
  • 27 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • Some of my own thoughts, which rebut the article in parts:

    1. Godot does have “barbell performance” - you can make it go fast if you drop to C++ and do low-level engine things to add new nodes, resources, etc. You can also make it go fast when you use the premade nodes without a great deal of script in between (and the nodes are, FWIW, pretty flexible and composable). What it doesn’t do at present is the thing Unity users are used to, which is “fast scripting”. Fast scripting still means working around the garbage collector and the overheads of going between native code and a runtime. C# is a kind of flytrap for the needs of high-end games, and Unity has only seemingly surmounted the issues by doing a lot of custom engineering for their use case. That is, you don’t really code standard C# in Unity, you code Unity’s C#, which is nearly as bespoke as GDScript.
    2. Saying the engine is coded in a naive way is not as smart a critique as it seems, because there’s a maintenance cost to always doing things in exactly the most optimal way, and the target for what is fastest changes every time the platform changes. As an (until recently) relatively small project, it’s overall better that the engine stay relatively easy to build and straightforward to modify, which is what it’s done. The path it’s taken has helped it stay “lightweight”. The price of that is that it sometimes passes up even low-hanging fruit that would be a win for 90% of users.
    3. The 3D in Godot 4 is capable of good test scenes, but everyone seems to agree that it’s not really ready for production for speed reasons. Any specific point on this just backs that up. And that’s disappointing in one sense, but pretty okay in others. If you need high-end graphics, Unreal will welcome you for the time being.
    4. On that note, developing for console always comes with fussy limitations, at minimum just meeting TRC/TCR/lot check; that’s why professional porting is a thing. Engine devs usually end up in the position of maintaining these multiple-API abstractions because it’s necessary for porting. It’s the same deal with the audio code, the persistent storage, the controllers, the system prompts, it just goes on and on like that. So, rewriting the rendering bindings to do things in the D3D way and not the Vulkan way is actually a bit of a whatever; it’s more rendering code. It changes some assumptions about what binds to what. But it accesses the same kind of hardware, running the same kind of shaders. A lot of ports in the not-so-distant past basically had to start over because the graphics hardware lacked such a common denominator.

    The author’s bio says that they have been doing this as a professional for about 5 years, which, at face value, means that they haven’t seen the kinds of transitions that have taken place in the past, or how widely game scope can vary. The way Godot does things has some wisdom-of-age in it, and even in its years as a proprietary engine (which you can learn something of by looking at Juan’s MobyGames credits), the games it was shipping were aiming for the bottom of the market in scope and hardware spec: a PSP game, a Wii game, an Android game. The luxury of small scope is that you never end up in a place where optimization is some broad problem that needs to be solved globally; it’s always one specific thing that needs to be fast. Optimizing for something bigger needs production scenes to provide profiling data. It’s not something you want to approach by saying “I know what the best practice is” and immediately architecting for it based on a shot in the dark. Being in a space where your engine just does the simple thing every time instead means it’s easy to make the changes needed to ship.


  • The game has already consumed over 40 hours of my time, and I’ve got plenty more campaign to go. It does just about all the stuff I wanted JA2 to have in order to play faster - combat is faster, looting is faster, inventory is faster. It has a few things that look like X-COM, but it still mostly plays like JA. The early game is the roughest part, but things definitely shaped up once I had a team with size, experience and gear.

    And the campaign is detailed, with a few surprises and plenty of side quests - it does some things to pull the rug out from under you, which is rude, but rewarding if you play along and accept a few losses (or carefully savescum and go out of your way to avoid triggering timed quests).



  • Arch is always “latest and greatest” for every package, including the kernel. It lets you tinker, and it’s always up to date. However, a rolling release introduces more ways to break your system - things start conflicting under the hood in ways you weren’t aware of, configurations that used to work no longer do, etc.

    This is in contrast to everything built on Debian, which Mint is one example of - Mint adds a bunch of conveniences on top, but the underlying “how it all fits together” is still Debian. What Debian does is set a target for stable releases and ship a complete set of known-stable packages. This makes it great for set-and-forget uses, servers that you want to just work and such. And it was very important back in the 90s, when it was hard to get Internet connectivity. But it also means that it stays behind the curve on application software releases, by months to a year or more. And the original workaround for that - “just add this other package repository” - can, like Arch, eventually break your system by accident.

    But neither disadvantage is as much of a problem now as it used to be. More of the software is relatively stable, and the stuff you need the absolute latest version of can often be found as a Flatpak, Snap, or AppImage - formats that are more self-contained and don’t rely on the dependencies you have installed; just “download and run.”

    Most popular distros now are Arch- or Debian-flavored - same system, different veneer. Debian itself has become a better desktop option in recent years, simply because of improvements to the installer.

    I’ve been using Solus 4.4 lately, which has its own rolling-release package system. Less software, but the experience is tightly designed for the desktop, and it doesn’t push me to open a terminal for everything the way the more classical Unix designs behind Arch and Debian do. The problem both of those face as desktops is that they assume up front that you may only have a terminal, so the “correct way” of doing everything tends to start and end with the terminal, and the desktop is kind of glued on - it works for some things but not others.


  • My principle of “blockchain’s fundamental value” is simply this: A blockchain that secures valuable information is valuable.

    To break that down further:

    • “Valuable information” isn’t data - it’s something that you can interpret, that has meaning and power to affect your actions. So, price speculation taking place on a chain isn’t that valuable in a broad, utilitarian sense, but something like encyclopedic knowledge, historical records, and the like might be. The sense of “this is real” vs “this is Monopoly money” is related to the information quality.
    • “Secures” means that we have some idea of where the information came from, who can access it, and whether it’s been altered or tampered with (the tamper-evidence part is sketched below). Most blockchains follow the Bitcoin model and are fully public ledgers, storing everything - and even within that model (leaving aside Monero etc.) there are positive applications, but whether any of it is “automatically secure” depends entirely on what application you’re aiming for.
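
    To make the tamper-evidence part concrete, here is a minimal, hypothetical hash-chain sketch in Python - not any real chain’s format, just the core idea that each block commits to the hash of the previous one, so quietly editing an old record breaks every link after it:

    ```python
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        """Hash a block's contents (including its stored previous hash) deterministically."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append_block(chain: list, data: str) -> None:
        """Append a block that commits to the hash of the current last block."""
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev": prev, "data": data})

    def verify(chain: list) -> bool:
        """Recompute every link; tampering with earlier data makes verification fail."""
        return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

    chain: list = []
    append_block(chain, "historical record A")
    append_block(chain, "historical record B")
    print(verify(chain))          # True
    chain[0]["data"] = "altered"  # quietly edit an old entry
    print(verify(chain))          # False - the next block's stored link no longer matches
    ```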

    You don’t need to include tokens, trading, finance, or the specific method of security to arrive at this idea of what a blockchain does, but having them involved addresses - though maybe without concretely solving - the question of paying upkeep costs, a problem that has always dogged open, distributed projects in the past. If the whole chain becomes more valuable because one person contributes something to it, then you have a positive feedback loop in which a culture of remixing and tipping is good. It tends to get undercut by “what if I made scam tokens and bribed an exchange to list them”, by the maximalist “we will rule the world” cultures of Bitcoin and Ethereum, or by the cynical “VC-backed corporate blockchains”, but the public alt chains that are a bit out of the spotlight with longer histories, stuff like Tezos and NEM/Symbol, tend to have a more visible sense of purpose in this direction - they need to make a myth about themselves, and the myth turns into information by chance and persistence.

    What tends to break people’s brains - both the maxis, and people who are rabidly anti-crypto - is that securing on-chain value in this way also isn’t a case of “public” vs “private” goods. It’s more akin to “commons” vs “enclosed” spaces, which is an older notion that hasn’t been felt in our political lives in centuries, because the partnership of nation-states and capital has been so strong as a societal coordinating force - the state says where the capital should go, the people that follow that lead and build out an empire get rewarded. The commons is, in essence, the voice in the back of your mind asking, “Why are you in the rat race? Do you really need an empire?” And this technology is stating that, clearly and patiently: making a common space better is another way to live.

    And so there is a huge amount of spam around “ownership”, but ownership itself isn’t really a factor. That’s just another kind of information that the technology is geared towards storing. The social contract is more along the lines that if you are doing good for a chain and taking few risks, a modest, livable amount of credit is likely to flow to you in time. Everyone making “plays” and getting burned is trying to gamble with it, or to advance empire-building goals in a basically hostile environment that will patch you out of the flow of information.



  • I have no plans to support P92, precisely because it’s going to “push” users together as a commodity. What Meta has jurisdiction over is not its communities but rows of data - in the same way that Reddit’s admins have conflicted with its mods, it is inherently not organized in a way that lets it properly represent any specific community or its actions.

    So the cost-benefit from the side of extant fedi is very poor: it won’t operate in a standard way, because it can’t, and the quality of each additional user won’t be particularly worth the pain - most of them will just be confused by being presented with a new space, and if the nature of it is hidden from them it will become an endless misunderstanding.

    If a community using a siloed platform wants to federate, that should be a self-determined thing, and they should front the effort to remain on a similar footing to other federated communities. The idea that either side here inherently wants to connect and just “needs a helping hand” is simply wrong.


  • I believe there is a healthy relationship between instances and magazines, actually: the way in which topical forums tend to be “hive-mindy” fits well with Fediverse instance culture. The difference is this: on Reddit, scaling leads toward “locking down” topical discussion into a bureaucratic game of dancing around every rule, because all users are homogeneous - just a name, a score, and a post history. Here, you can have “this board is primarily about this” but then allow in a dose of chaos, affording some privilege to the instance users who already have a set of norms and values in mind, and pushing federated comments out of view where you know the userbases are destined to get into unproductive fights.

    This also combats common influencer strategies like bots and sockpuppets, because you’ve already built in the premise of an elite space.

    There’s work needed on the moderation technology of #threadiverse software to achieve this kind of vision, but it’s something that will definitely be learned as we go along.


  • I find that what is needed depends on the task. Mostly, it’s about whether you need to switch views on information frequently. If you’re working in a maximally focused way, you already have the right info in front of you, so you don’t have to make the view more diverse.

    Two monitors can be really helpful if you’re in a situation where you need one view to always stay the same (e.g. reading one document while editing another) and the editing app is some fussy internal thing that always wants to be on the first window when started, but I also haven’t had that setup in quite a few years. Tiling can get you 80% of that if the screen is sufficiently large and the software cooperates.

    When in Windows I stick to using the Win + arrow keys shortcuts to tile; in Linux I’ve used a few different WMs over the years but lately have been using Ubuntu defaults and basically working with it like Windows.

    There is a lot of utility in not relying on screens at all and using a small gridded or ruled notebook with a spiral binding as the second screen. Mark it up with color multipens and sticky notes, and take it around in your jacket pocket or a belt bag.


  • Mastodon’s export portability mostly focuses on the local social-graph aspects (follows, blocks, etc.), and while it has an archive function, people frequently lament losing their old posts and that graph relationship when they move.

    Identity attestation is solvable in a legible fashion with any external mechanism that links back to report “yes, account at xyz.social is real”, and this is already being done by some Mastodon users - it could be through a corporate web site, a self-hosted server, or something going across a distributed system (IPFS, Tor, blockchains…). There are many ways to go beyond that, though - for example, providing a Linktree-style landing page to ease browsing the different facets of an identity, or describing “following” in more than local terms.
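
    As a rough sketch of that link-back idea (hypothetical URLs, and not Mastodon’s actual implementation, though its verified links use a rel="me" convention along similar lines): fetch the page someone claims as theirs and check that it links back to the fediverse account.

    ```python
    import re
    import urllib.request

    def links_back(external_url: str, profile_url: str) -> bool:
        """Fetch a claimed external page and check that it contains a link
        back to the given fediverse profile URL."""
        with urllib.request.urlopen(external_url, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        # Look for an <a> or <link> tag whose href is exactly the profile URL.
        pattern = r'<(?:a|link)\b[^>]*href=["\']' + re.escape(profile_url) + r'["\']'
        return re.search(pattern, html, re.IGNORECASE) is not None

    # Hypothetical example: does this homepage vouch for the account at xyz.social?
    print(links_back("https://example.com/", "https://xyz.social/@someone"))
    ```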

    I would consider these all high-effort problems to work on, since a lot of it has to do with interfaces, UX, and privacy tradeoffs. If we aim to archive everything, then we have to build an omniscient distributed system, which, besides presenting a scaling issue, conflicts with privacy and control over one’s data - so that is probably not the goal. But asking everyone to just make a lot of backups, republish stuff by hand, and set up their own identity service isn’t right either.



  • If you look at the links by each post, you’ll notice that some reference a URL that goes off of your local instance. In Lemmy these show up as icons; in kbin they appear under the “more” link. Sometimes it’s unclear who or where I’m interacting with, and examining the URL helps me get some idea of it. In federated social media, different instances often develop different subcultures, but since they can access each other, you have more dimensions of interaction and more choices about how to behave.







  • Not dead, just sleeping. It’s a tougher, higher-interest-rate market, which cuts out a lot of the gambling behavior. I remain invested, but my principle has shifted away from financial and trad-economic terms to this:

    Blockchains are valuable where they secure valuable information. Therefore, if a blockchain adds more valuable information, it becomes more valuable.

    And that’s it. You don’t have to introduce markets and trading to make the point, but it positions those elements in a supporting role, and gets at one of the most pressing issues of today: where should our sources of truth online start? Blockchains can’t solve the problems of false sensation, reasoning or belief, but they fill in certain technical gaps where we currently rely on handing over custody to someone’s database and hoping nothing happens or they’re too big to fail. It’s just a matter of aligning the applications towards the role of public good, and the air is clear for that right now.




  • I think it’s reasonable for some instances, where there’s good alignment. There was a thread I replied in a few days back about how/if TTRPG creators (who are mostly small enthusiasts themselves) could advertise in related magazines, and legitimizing that business wouldn’t really pose a conflict for the hobby - that’s how it was built in the first place! It’s just a matter of finding a place for it and defining the technical solutions.

    As a general “let in all the advertisers and promise riches for someone” measure, it does cause known problems. There is some freedom to figure out what works in a specific case here; it’s not defined top-down, since it isn’t centralized.