• 0 Posts
  • 8 Comments
Joined 2 years ago
Cake day: June 14th, 2023

  • Enough of that crazy talk - plainly WheeledDeviceServiceFactoryBeanImpl is where the dependency injection annotations are placed. If you can work out what the code does without stepping through it in a debugger, and the backtrace doesn’t have at least two hundred lines of Spring Boot in it, then plainly it isn’t enterprise enough.

    Fair enough, though. You can write stupid overly-abstract shit in any language, but Java does encourage it.



  • Well now. My primary exposure to Go would be using it to take first place in my company’s ‘Advent of Code’ several years ago, in order to see what it was like, after which I’ve been pleased never to have to use it again. Some of our teams have used it to provide microservices - REST APIs that do database queries, some lightweight logic, and conversion to and from JSON - and my experience of working with that is that they’ve inexplicably managed to scatter all the logic among dozens of files, for what might be done with 80 lines of Python. I suspect the problem in that case is the developers, though.

    It has some good aspects - I like how easy it is to do a static build that can be deployed in a container.

    The actual language itself I find fairly abominable. The lack of exceptions means that error handling is threaded through everything, and not necessarily any better than in other modern languages. The lack of overloads means that you end up with multiple definitions of e.g. math.Min cluttering things up. I don’t think the container classes are particularly good. And pointers seem to exist solely to let you have nil-pointer panics; a pointless wart.
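
    As a rough sketch of what that looks like in practice - a made-up readConfig helper loading a hypothetical config.json, so none of these names come from a real project - every fallible call gets its own check, and the error is passed back up the chain by hand:

        // Illustrative only: three operations, three explicit error checks.
        package main

        import (
            "encoding/json"
            "fmt"
            "os"
        )

        type Config struct {
            Port int `json:"port"`
        }

        func readConfig(path string) (*Config, error) {
            data, err := os.ReadFile(path) // check one
            if err != nil {
                return nil, fmt.Errorf("reading %s: %w", path, err)
            }
            var cfg Config
            if err := json.Unmarshal(data, &cfg); err != nil { // check two
                return nil, fmt.Errorf("parsing %s: %w", path, err)
            }
            return &cfg, nil
        }

        func main() {
            cfg, err := readConfig("config.json")
            if err != nil { // and the caller checks it all over again
                fmt.Fprintln(os.Stderr, err)
                os.Exit(1)
            }
            fmt.Println("port:", cfg.Port)
        }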

    If what you’re wanting to code is the kind of thing that Google do, in the exact same way that Google do it, and you have a team of hipsters who all know how it works, then it may be a fine choice. Otherwise I would probably recommend using something else.


  • I feel that Python is a bit of a ‘Microsoft Word’ of languages. Your own scripts are obviously completely fine, using a sensible and pragmatic selection of the language features in a robust fashion, but everyone else’s are absurd collections of hacks that fall to pieces at the first modification.

    To an extent, ‘other people’s C++ / Bash scripts’ have the same problem. I’m usually okay with ‘other people’s Java’, which to me is one of the big selling points of the language - the slight wordiness and lack of ‘really stupid shit’ makes collaboration easier.

    Now, a Python script that’s more than about two pages long? That makes me question its utility. The ‘duck typing’ everywhere makes any code that you can’t ‘keep in your head’ very difficult to reason about.


  • Frezik has a good answer for SQL.

    In theory, Ansible should be used for creating ‘playbooks’ listing the packages and configuration files which are present on a server or collection of servers, and then ‘playing the playbook’ arranges it so that those servers exist and are configured as you specified. You shouldn’t really care how that is achieved; it is declarative.

    However, in practice it has input, output, loops, conditional branching, and the ability to execute subtasks recursively. (In fact, it can be quite difficult to stop people from using those features, since ‘declarative’ doesn’t necessarily come easily to everyone, and it makes for very messy config.) I think those are all the features required for Turing equivalence?

    Being able to deploy a whole fleet of servers in a very straightforward way comes as close to the ‘infinite memory’ requirement as any programming language can get, although you do need basically infinite money to do that on a cloud service.



  • addie@feddit.uk to Programmer Humor@lemmy.ml · IEEE 754 · 5 months ago

    You can only store rational numbers as a ratio of two integers, and there are infinitely many more irrational numbers than rational ones - as soon as you take (almost any) root or do (most) trigonometry, your exact ratio counts for nothing. Hardcore maths libraries get around this by keeping the “value in progress” as an expression for as long as possible, but working with expressions is exceptionally slow by computer standards - it takes quite a long time to keep them in their simplest form whenever you manipulate them.
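
    As a quick sketch of that trade-off - using Go’s math/big rationals purely as an example - exact ratios survive ordinary arithmetic, but the moment you take a root you are back to an approximation:

        // Exact rational arithmetic versus an unavoidable float approximation.
        package main

        import (
            "fmt"
            "math"
            "math/big"
        )

        func main() {
            // 1/3 stored exactly as a ratio of two integers.
            third := big.NewRat(1, 3)
            sum := new(big.Rat).Add(third, third) // still exact: 2/3
            fmt.Println(sum.RatString())          // prints 2/3

            // sqrt(2) is irrational: no ratio of integers represents it,
            // so we fall back to a 64-bit float approximation.
            fmt.Println(math.Sqrt(2)) // prints 1.4142135623730951
        }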


  • My old job had a lot of embedded programming - hard real-time work on Z80-family processors like Z800s and eZ80s, controlling industrial devices. Quite a pleasant assembly dialect to do bit-twiddling in, actually, and it’s great to be able to step through the debugger and see that what the CPU is running is literally your source code, opcode by opcode.

    Back when computers were very simple things - I’m thinking of a ZX Spectrum, where you can read directly from the input ports and write directly into the framebuffer, no OS in your way, just code - assembly made a lot of sense, and was even fun. On modern computers, it is not so fun:

    • x64 is just a fucking mess

    • you cannot just read and write what you want; the kernel won’t let you. So you’re going to be spending a lot of your time calling system routines.

    • 99% of your code will just be arranging data to suit the calling convention of your OS, and doing pointless busywork like stack pointer alignment. Writing some macros to do it for you makes your code look like C. Might as well just use C, in that case.

    Writing assembly makes some sense sometimes - it’s required for embedded work, you might be writing something very security-conscious where timing is essential, or you might be lining up data for vectorisation where higher-level languages don’t have the constructs to get it right - but these are very small bits of code. You would be mad to consider “making the whole apple pie” in assembly.