Did #julialang end up kinda stalling or at least plateau-ing lower than hoped?

I know it’s got its community and dedicated users and has continued development.

But without being in that space, and speculating now at a distance, it seems it might be an interesting case study in a tech/language that just didn’t have a landing spot it could arrive at in time, as the tech world & “data science” reshuffled while Julia tried to grow … ?

Can a language ever solve a “two language” problem?

@programming

  • tschenkel@mathstodon.xyz

    @astrojuanlu @maegul @programming

    I think there are two ways to think about the two-language problem (from my scientist’s view):

    1. From the developer side: when the target is a product/library it may indeed be better to develop in a typed, compiled language, and just have a wrapper in Python, or another high-level language, that the users/scientists use.

    2. From the user/scientist’s side: start developing in Matlab/Python, build the proof of concept, and demonstrate that the method works. But for the real-world application more performance is needed, so one needs to find a Rust/C++/Fortran developer to join the project. There are no funds for that, so the code never gets developed.

    Julia solves the second kind nicely. If I had to rewrite everything in Rust/C(++)/Fortran, the last few years’ research would not have happened.
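
    Roughly the kind of thing I mean (a toy sketch, not actual research code; the function and data are made up for illustration): a plain Julia loop already compiles to native code, so the prototype is the fast version and nothing has to be rewritten in C/Fortran later.

    ```julia
    # Hypothetical example: a running mean written as a naive loop.
    # In Julia this loop compiles to machine code, so there is no need to
    # vectorise it or reimplement it in a compiled language for speed.
    function running_mean(x::AbstractVector, w::Int)
        n = length(x)
        out = similar(x, float(eltype(x)), n - w + 1)
        s = sum(@view x[1:w])
        out[1] = s / w
        for i in 2:(n - w + 1)
            s += x[i + w - 1] - x[i - 1]   # slide the window by one element
            out[i] = s / w
        end
        return out
    end

    running_mean(randn(10^7), 100)   # runs at compiled-loop speed
    ```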

    #julialang

    • maegul@hachyderm.io (OP)

      @tschenkel @astrojuanlu @programming

      I’d suppose part of the problem might be that there’s a somewhat hidden third category of user: one that “feels” whatever added complexity there is in a language like Julia that targets the two-language problem, but has no real need for performant “product” code.

      And that lack of adoption amongst this cohort and your first category reinforces the language separation.

      I may be off base about whether there’s a usability trade-off, but I’d bet there’s at least the perception of one.

      • tschenkel@mathstodon.xyz

        @maegul @astrojuanlu @programming

        You put your finger right on it. If I tell people that they can replace Matlab or Python with Julia, then there is often not enough pressure to justify learning another language.

        But when I show them that (a) the fundamental syntax is nearly the same as Matlab, or (b) the same thing in Python is much clunkier (NumPy and SciPy are not exactly elegant in my eyes), and that they can get a 40× and 600× speedup respectively, then there is much more interest.
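
        A toy example of the syntax point (made-up arrays, just to show the flavour; the NumPy equivalents in the comments are for comparison):

        ```julia
        # Array code in Julia reads much like Matlab.
        A = rand(1000, 1000)
        b = rand(1000)

        x = A \ b            # backslash solve, as in Matlab (np.linalg.solve(A, b) in NumPy)
        y = sin.(x) .+ 2x    # elementwise ops via dot-broadcasting (np.sin(x) + 2*x)
        r = A * x - b        # * is true matrix-vector multiplication (A @ x - b)
        ```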

        I needed the performance and DifferentialEquations.jl gave it, and my students stay for the easier language. But then they don’t have years of Matlab or Python under their belts, so the perceived pain of learning a new language to the same level is lower, of course.

        At the moment we still teach Matlab mostly, because that’s what employers expect, but some of my students choose Julia in addition to it. And they do like it.

        More and more universities are moving their scientific “computational thinking” type of classes to Julia (from either Matlab or Python), which may shift the trend once there are more STEM graduates who know the language.

      • tschenkel@mathstodon.xyz

        @maegul @astrojuanlu @programming

        BTW, I’m not talking about “product code” in Julia, necessarily. The point is that I can stay at the “research code” level and don’t have to rewrite anything, but can just move from a toy problem with a few parameters to a full-size model with hundreds of parameters.

        In my case I had Python code solving an ODE system, and a global optimisation of all its parameters took a week on 24 cores of a cluster. In Julia, the same case runs over the lunch break on my laptop.
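
        To give an idea of what that looks like (a minimal sketch with a made-up two-parameter ODE and a made-up loss, not the actual model): the same DifferentialEquations.jl solve call from the toy-problem stage is reused unchanged inside the parameter scan.

        ```julia
        # Hypothetical toy model, standing in for the real ODE system.
        using DifferentialEquations
        using Base.Threads

        function rhs!(du, u, p, t)
            a, b = p
            du[1] = a * u[1] - b * u[1] * u[2]
            du[2] = -u[2] + b * u[1] * u[2]
        end

        u0    = [1.0, 1.0]
        tspan = (0.0, 10.0)
        prob  = ODEProblem(rhs!, u0, tspan, [1.5, 1.0])
        sol   = solve(prob, Tsit5())            # the "toy problem" stage

        # A crude stand-in for the global optimisation: scan candidate
        # parameters on all local threads, reusing the same problem.
        losses = zeros(100)
        @threads for i in 1:100
            p_i   = [1.0 + 0.01 * i, 1.0]
            sol_i = solve(remake(prob, p = p_i), Tsit5(), saveat = 0.1)
            losses[i] = sum(abs2, sol_i[1, :] .- 1.0)   # hypothetical target value
        end
        ```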

        For the most recent paper, a PhD student of ours used 2000-ish GPU hours for a global sensitivity/uncertainty analysis. That would have been impossible in Python or Matlab. On a single GPU it would have taken 10 years (300 years in Python), instead of 3 months*.

        *Yes, you can just throw more GPUs at the problem, but then we can start the discussion about the CO2 footprint being 40 or 600 times higher, respectively.

        Note: all our benchmarks are for solving ODE systems. For plain linear algebra there isn’t really a difference in speed between Matlab, NumPy, or Julia, because that’s all going to run on the same linear algebra (BLAS/LAPACK) libraries.
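
        For example (toy matrices, just to make the point): the multiply below lands in the same BLAS routine that NumPy’s `A @ B` or Matlab’s `A*B` would call, so the host language barely matters there.

        ```julia
        using LinearAlgebra

        A = rand(2000, 2000)
        B = rand(2000, 2000)

        C = A * B                 # dispatches to the BLAS gemm routine
        BLAS.get_num_threads()    # the BLAS library's threading, not the host
                                  # language, is what sets the speed here
        ```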

      • Hrefna (DHC)@hachyderm.io

        @maegul

        Considering that, it may be worth highlighting that tools like JAX exist as well (https://github.com/google/jax). These have even become an expected integration in some toolkits (e.g., numpyro).

        It may not be the most elegant approach, but there’s a lot of power in something that “mostly just works and then we can optimize narrowly once we find a problem”

        It doesn’t make a solution that solves this mess bad, but I do wonder about it being a narrow niche

        @tschenkel @astrojuanlu @programming

        • tschenkel@mathstodon.xyz

          @hrefna @maegul @astrojuanlu @programming

          Interesting. But it seems to be limited to arrays, and I’m wondering what the benefit is over regular NumPy. In my tests I did not find NumPy to be slower than any other BLAS or LAPACK implementation (if type stable and immutable).

          Do you have any comparisons?

          In any case, I don’t think it would work for solving ODEs unless I’d implement the RHS in a JIT compiled function.

          I used to love Python, but now it looks like a collection of add-ons/hacks to fix the issues I have.