• 0 Posts
  • 24 Comments
Joined 8 days ago
Cake day: August 10th, 2025

  • You seem to be assuming that the volume is immediately replaced by the external atmosphere, which I doubt is valid

    No, I was assuming your volume decreases. I don’t actually know that to be the case, but my assumption is that there isn’t “extra” space inside a person, and so if you lose material from a part of your body that isn’t encased in anything rigid your volume decreases slightly.

    So maybe I did have my terminology wrong. When a hot air balloon deflates, it falls. The density went up, but that’s not what’s directly relevant. The weight (in the strict physics sense) went down, sure, but the “number on the scale”, weight minus buoyant force, went way up, because the balloon lost some lower-density volume that was making the whole thing float. That scale reading is what I was incorrectly calling “weight”. Same thing for a farting person.



  • Fart gas is warmer than the surrounding atmosphere, and therefore less dense. Your digestive system is under very slight compression (10-20 mmHg gauge pressure, according to the internet), which I would guess isn’t enough pressure to matter more than the temperature difference. Fart gas is also less dense than air at a given pressure by a pretty significant margin (1.06 g/L compared with 1.20 g/L).

    When you fart, you’re releasing gas that is less dense than the atmosphere, so the buoyant support you lose is worth more than the mass you lose, which means you get slightly heavier. Think of yourself as a hot air balloon with a very tiny chamber: when you release a 90 milliliter fart, you lose a little buoyancy and sink a little. You get heavier when you fart.

    I haven’t done the math, but I looked around on the internet at some numbers, and that’s what I think. I also ignored this because it is clearly AI slop, which is a little upsetting.
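
    Anyway, for scale, here’s a quick back-of-envelope using the numbers above (a rough sketch, treating the quoted densities and the 90 milliliter figure as approximate):

        # Net change in the scale reading from releasing 90 mL of gas that is
        # less dense than the surrounding air, using the densities quoted above.
        RHO_AIR = 1.20    # g/L, ambient air
        RHO_FART = 1.06   # g/L, fart gas
        VOLUME_L = 0.090  # 90 mL released

        mass_lost = RHO_FART * VOLUME_L     # grams of gas you lose
        buoyancy_lost = RHO_AIR * VOLUME_L  # grams of buoyant support you lose
        print(f"{(buoyancy_lost - mass_lost) * 1000:.0f} mg heavier")  # ~13 mg

    So the effect is real under these assumptions, but it’s on the order of a dozen milligrams.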



  • You define it in exactly the same way you just did. Completely fine, you have to do it for lots of things. It’s nice that Python can do that too.

    Now, I’ll grab a random snippet of code from some random file from my source dir:

            # Pull the ids of replies this user has already bookmarked.
            existing_bookmarks = db.session.execute(
                text('SELECT post_reply_id FROM "post_reply_bookmark" WHERE user_id = :user_id'),
                {"user_id": user_id}).scalars()
            # Find a non-deleted reply among them, if there is one.
            reply = PostReply.query.filter(PostReply.id.in_(existing_bookmarks), PostReply.deleted == False).first()
            if reply:
                data = {"comment_id": reply.id, "save": True}
                # Trying to save it again should be rejected with this exact message.
                with pytest.raises(Exception) as ex:
                    put_reply_save(auth, data)
                assert str(ex.value) == 'This comment has already been bookmarked.'
    

    You can see some classes in use, which again is fine. But you also see the reply JSON built inline as a plain dict, and a database query returning a list of post_reply_id values without needing a special interface definition for multiple return values. A lot of cognitive and computational load per line of code is being saved because the language features do the heavy lifting instead of user-defined classes for everything. It means you don’t have as many adventures through the code where you’re trying to modify some user-defined interface class, you don’t need as much strong typing, that kind of thing.

    I would bet heavily that a lot of the things happening in that short little space of code would need specific classes to get them done if the same project were implemented in some C++-derived language. Maybe not; I just grabbed a random segment of code instead of trying especially hard to find my perfect example to prove my point.
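
    Just to make that concrete, here’s roughly the shape it takes when every data exchange has to pass through a named class (sketched in Python for comparison; these class names are made up, not from the real project):

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class BookmarkIdList:
            """Wrapper type whose only job is returning multiple ids."""
            ids: List[int]

        @dataclass
        class SaveReplyRequest:
            """Wrapper type whose only job is carrying two fields into a call."""
            comment_id: int
            save: bool

        def build_save_request(bookmarks: BookmarkIdList) -> SaveReplyRequest:
            # Every hop now goes through a named interface class, which is the
            # ceremony the inline dict and .scalars() in the snippet above avoid.
            return SaveReplyRequest(comment_id=bookmarks.ids[0], save=True)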

    It’s fine, there are significant weaknesses to Python too; I’m not trying to say “yay Python, it’s better for everything” or anything like that. I’m just saying that if you don’t get familiar with at least some language that does things more that way, and instead get solely accustomed to user-defined classes or templates for every information exchange or functional definition, you’ll be missing out on a good paradigm for thinking about programming. That’s all.


  • Complex data structures are not “more of a C++ type of program structure”.

    Oh, complex data structures by themselves are not, agreed. But equating complex data structures with user-defined data structures (in the form of classes and fields and whatnot), and using the latter as the primary method of storing and working with data (so that you’re constantly having to bring into your mental scope a bunch of different classes and how they need to interact), is 100% a C++ type of program structure. That pattern is perfectly common in Python too, but it’s not primary in the same universal way that it is in C++ and derivatives. It gets to exist as its own useful thing without being the only way. That’s what I am trying to say.


  • IDK, I just have never really had this become a serious issue for me. I get what you mean: some actions are a little bit of a pain in the neck because people are often sloppy about typing. But literally the only time I can remember it being an issue at all has been when numpy is involved, and I have to figure out whether something is a native Python thing or a numpy-fied custom structure (toy example below).

    I mean, there just aren’t that many types. Generally something is a list, a number, a map, or a string, and it’s pretty obvious which. Maybe there are OOP-heavy domains where a lot of variables are objects of some class (sort of more of a C++ type of program structure), and there it starts to become really critical to have strong type tools. I’m just saying I haven’t really encountered much trouble with it. I’m not saying it’s imaginary, and you may be right in your experience, but I’ve worked on projects way bigger than a few hundred lines and never really had much of an issue with it in practice.
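
    For what it’s worth, here’s a toy illustration of the numpy case (a made-up example, not from any real project):

        import numpy as np

        values = np.array([1, 2, 3])
        total = values.sum()

        print(type(total))           # a numpy scalar type (e.g. numpy.int64), not a plain int
        print(type(sum([1, 2, 3])))  # <class 'int'>

        # Both behave like numbers until you hand the first one to something
        # that tries to JSON-serialize it, and then you get to figure out
        # where the numpy type snuck in.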


  • Plus I felt Python was too new and would skip a lot of core programming skills I’d just like to know. I’m not super interested in doing it the new way with all the helpers, or I won’t feel like I learned anything.

    Okay, you definitely want to learn C then. C# and C++ both add a ton of helpers: C# has a massive runtime environment that’s opaque and a little bit weird, and C++ has a massive compile-time environment that’s opaque and very weird. It’s sort of pick your poison. If you learn C and get skilled with it, you’ll understand what is actually going on, and those fundamentals will set you up well for whatever higher-level language you want to learn in the future.

    Put another way: C# will hide just as many of the fundamentals and hardcore details from you as Python will; it’ll just do it in a counterintuitive fashion that makes things more confusing, with more C#-specific quirks on top.

    I’d eventually like to learn Unity as well so I decided on C#

    I would actually just cut out the middleman and start with the Unity editor then. It might be a really good introduction to the nature of programming in general, without throwing a bunch of extra nonsense at you, and in a really motivating format.

    I do have the .NET SDK and it seems to try to compile a simple program, it just throws errors even on an example program that shouldn’t have any. I’m sure it’s something dumb.

    What’s the program and what’s the error? I’m happy to help if something jumps out at me. I’ve been voicing my opinion on what might be better ways to attack this in general, but I’m sure I or other people here can help sort out the issue if you really want to take this approach and you’re just getting stuck on something simple.




  • I really would not recommend specializing in C# at this point in computing history. You can do what you want, obviously, but Python is much more likely to be what you want. C++ or Java might be okay if you want a job and can live with somewhat dated, not-ideal languages, or you could learn one of the proliferation of niche backend Linuxy languages, but C# has most of the drawbacks of C++ and Java without even their relative level of popularity.

    IDK what issue you’re having with VSCode, but I think installing the .NET SDK and then using dotnet by hand from the command line to test the install might be a good precursor to getting it working in VSCode. But IDK why you would endeavor to do this in the first place.









  • I won’t say 100%, but they’re generally pretty good. Big ones I can think of:

    • They’re going to apply every attack against Kamala Harris that they did against Biden
    • Trump is going to be infinitely worse for the Palestinians than even Biden was

    The first one is a little bit qualified, I guess. I was somewhat against replacing Biden for that reason (definitely before the debate), which was absolutely a mistake. But in retrospect, the way they were able to blame Kamala Harris for Gaza and inflation and make it work was pretty spot-on to what I predicted.

    On the second one, people were furiously telling me how wrong I was and how impossible it would be for anyone to be worse than Biden, and in the early days they were saying that Trump had achieved a cease-fire, which was supposedly proof of how easy it would have been if only Biden had put in some slight effort.