• FizzyOrange@programming.dev · 4 days ago

    Well, git is for source control, not binary artefacts

    Only because it is bad at binary artefacts. There’s no fundamental reason you shouldn’t be able to put them in version control.

    It’s not much of an argument to say “VCSes shouldn’t be able to store binaries because they aren’t good at it”.

    What are your requirements? What do you need this for?

    Typically there’s a third- or first-party project that I want to use in my project. Sometimes I want to be able to modify it too (soft fork).

    And why do you think everyone else needs the same?

    Because I’ve worked in at least 3 companies that wanted to do this. Nobody had a good solution. I’ve talked to colleagues who also worked at other companies that wanted this. Often they come up with their own hacky solutions (git subtree, git subrepo, Google’s repo, etc. etc. - there are at least half a dozen of these tools).
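
    For example, the subtree flavour of this looks roughly like the following (repo URL and prefix directory are just illustrative):

    ```sh
    # Vendor an external project into a prefix directory, squashing its history:
    git subtree add --prefix=vendor/libfoo https://example.com/libfoo.git main --squash

    # Later, pull upstream changes into the same prefix:
    git subtree pull --prefix=vendor/libfoo https://example.com/libfoo.git main --squash
    ```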

    It’s quite possible you are doing it wrong.

    No offence, but your instinctive defence of Git and your instant leap to “you’re holding it wrong” are a pretty dead giveaway that you haven’t stopped to think about how it could be better.

    • ysjet@lemmy.world · 4 days ago

      Just to jump in here: git submodules and similar are a terrible design pattern that needs to be killed, not expanded. Create a library properly and stop cutting corners that will bite you in the ass.

      Three separate companies wanting to do it the lazy, wrong way doesn’t suddenly make it a good idea.

      • FizzyOrange@programming.dev · edited 2 days ago

        Libraries are not always a suitable solution. You just haven’t worked on the same projects I have, so you can’t imagine all the things submodules are used for.

        On top of that, I can’t force all third party projects to turn their repos into nice easily installable packages. Especially if they’re using a language that doesn’t have a package manager.

        • tyler@programming.dev · 3 days ago

          I think the point the user was making is that, if it isn’t already distributed as a library, you can just fork it and deploy it as a library artifact to your company’s internal artifact repository. You shouldn’t be pulling an external project as a submodule, that’s just coupling yourself way way too tightly to external code. So you turn that code internal and into a library.
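
          To sketch what I mean (the URLs, build step and auth here are all made up; adapt them to whatever artifact store you use):

          ```sh
          # Fork the upstream project into an internal mirror you control:
          git clone https://example.com/upstream/libfoo.git && cd libfoo

          # Build a release artifact (assumes the project has some build step):
          make dist

          # Publish it to the internal artifact repository (Artifactory-style HTTP PUT):
          curl -u "$USER:$TOKEN" -T libfoo-1.0.tar.gz \
               https://artifacts.internal.example/libs/libfoo/1.0/libfoo-1.0.tar.gz
          ```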

          • FizzyOrange@programming.dev · 2 days ago

            You shouldn’t be pulling an external project as a submodule, that’s just coupling yourself way way too tightly to external code.

            You’re no more tightly coupled than if you zip that repo up and put it on an internal server. It’s the exact same code; you’ve just changed the distribution method.
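
            A submodule pins an exact commit just like a zip of that commit would, e.g. (URL and SHA here are illustrative):

            ```sh
            # Add the external repo as a submodule under vendor/:
            git submodule add https://example.com/third-party/widget.git vendor/widget

            # Pin the exact commit you vetted (hypothetical SHA):
            git -C vendor/widget checkout 1a2b3c4

            # Record the pin in your own history:
            git add vendor/widget .gitmodules
            git commit -m "Pin widget at 1a2b3c4"
            ```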

            And my whole point is that this wouldn’t be necessary if Git had a version of submodules that worked properly!

            You guys seriously lack imagination.

            • tyler@programming.dev · 2 days ago

              I mean, you are more tightly coupled. It’s way more likely that someone is going to pull the git submodule (especially if you’re doing this with multiple projects) than that someone inadvertently updates the version of the library. This applies even more if you’ve created the library and deployed it to your own artifact repository yourself.

    • HaraldvonBlauzahn@feddit.org (OP) · 4 days ago

      Because I’ve worked in at least 3 companies that wanted to do this. Nobody had a good solution

      There are good solutions: use proper package managers with automated build support like dpkg, pacman, pip or perhaps uv, or, even better, Guix. Companies not doing that are just cutting corners here.
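
      With pip, for instance, consuming the code as a versioned package from an internal index looks like this (the index URL and package name are invented):

      ```sh
      # Install a pinned version from an internal package index:
      pip install --index-url https://pypi.internal.example/simple 'libfoo==1.2.3'

      # pip can even install straight from a git URL pinned to a commit:
      pip install 'git+https://example.com/libfoo.git@1a2b3c4'
      ```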

      • FizzyOrange@programming.dev · 3 days ago

        That can work in some cases, but it’s usually not that great for first-party projects where you want to be able to see and edit the code. And most package managers are OS- or language-specific, so they don’t work well with multi-language projects or with projects using a language that doesn’t have a good package manager (SystemVerilog, for example).

    • HaraldvonBlauzahn@feddit.org (OP) · 4 days ago

      Only because it is bad at binary artefacts. There’s no fundamental reason you shouldn’t be able to put them in version control.

      There is a fundamental reason: You can’t merge them.

      • FizzyOrange@programming.dev · 3 days ago

        So what? You can manually merge them. File locking is also a common solution (LFS supports that).
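
        The locking workflow is roughly this (file path is just an example; it needs an LFS server that supports the locking API):

        ```sh
        # Mark the binary file type as lockable in .gitattributes:
        echo '*.psd filter=lfs diff=lfs merge=lfs -text lockable' >> .gitattributes

        # Take the lock before editing, so a merge never has to happen:
        git lfs lock design/logo.psd

        # ...edit, commit, push...
        git lfs unlock design/logo.psd
        ```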

        The level of “you’re holding it wrong” here is insane.

    • Flipper@feddit.org · 4 days ago

      Git was made for the Linux kernel, which is pretty much only text files. For the complete decentralisation Git achieves, easy diffing and merging operations need to be defined. It works for what it was made for.

      Large files don’t work with git, as it always stores the whole history on your drive.

      For files that are large and not mergeable, SVN works better, and that is fine. You need constant online connectivity as a trade-off, though.

      Some build tools offer the option to define a dependency as a git path + commit, or as a local path. That works quite well, but is in the end just a workaround.
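
      Go modules are one concrete example of that (module path and SHA here are invented):

      ```sh
      # Pin a dependency to a specific git commit:
      go get example.com/libfoo@4f9d2c1

      # Or point it at a local checkout during development:
      go mod edit -replace example.com/libfoo=../libfoo
      ```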

      • FizzyOrange@programming.dev · 3 days ago

        Yes, I’m aware of where Git came from.

        Large files don’t work with git, as it always stores the whole history on your drive.

        Not any more; that’s only the default. Git supports sparse checkouts and blobless checkouts, both of which fetch only a subset of a repo. And it has actually supported --depth (shallow clones) for as long as I can remember.
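
        For example (repo URL invented):

        ```sh
        # Blobless clone: full history, but file contents fetched on demand:
        git clone --filter=blob:none https://example.com/big-repo.git

        # Sparse checkout: only materialise the directories you need:
        cd big-repo && git sparse-checkout set src/ docs/

        # Shallow clone: truncate history entirely:
        git clone --depth 1 https://example.com/big-repo.git
        ```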

      • HaraldvonBlauzahn@feddit.org (OP) · edited 4 days ago

        For files that are large and not mergeable, SVN works better, and that is fine.

        This. I have worked for a large research organization where a single SVN checkout took more than 24 hours. And they knew what they were doing.

        BTW jujutsu, created by an engineer who happens to work at Google, supports alternative backends that are meant for very large repos. But as I said, I think these do not align with the needs of the FOSS community.