Did #julialang end up kinda stalling or at least plateau-ing lower than hoped?

I know it’s got its community and dedicated users and has continued development.

But without being in that space, and speculating now at a distance, it seems it might be an interesting case study in a tech/language that just didn’t have a landing spot it could arrive at in time, as the tech world & “data science” reshuffled while Julia tried to grow … ?

Can a language ever solve a “two language” problem?

@programming

  • tschenkel@mathstodon.xyz
    4 months ago

    @hrefna @maegul @astrojuanlu @programming

    Interesting. But it seems to be limited to arrays, and I’m wondering what the benefit is over regular NumPy. In my tests I did not find NumPy to be slower than any other BLAS or LAPACK implementation (if type stable and immutable).

    Do you have any comparisons?

    In any case, I don’t think it would work for solving ODEs unless I’d implement the RHS as a JIT-compiled function.
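
    (For concreteness, a rough and purely hypothetical sketch of what I mean, assuming the library in question is JAX; the toy damped-oscillator RHS and the odeint call are just illustrations, not anything I’ve benchmarked:)

    ```python
    # Hypothetical sketch: JIT-compiling the ODE right-hand side in JAX
    import jax
    import jax.numpy as jnp
    from jax.experimental.ode import odeint

    @jax.jit
    def rhs(y, t):
        # toy damped oscillator, a stand-in for a real right-hand side
        x, v = y
        return jnp.array([v, -x - 0.1 * v])

    t = jnp.linspace(0.0, 10.0, 200)
    y0 = jnp.array([1.0, 0.0])
    ys = odeint(rhs, y0, t)  # ys has shape (200, 2)
    ```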

    I used to love Python, but now it looks like a collection of add-ons/hacks to fix the issues I have.

    • Hrefna (DHC)@hachyderm.io
      4 months ago

      @tschenkel

      Mostly, its advantage as far as arrays go is the ability to push work out to an accelerator (GPU) without making code changes. Its JIT functionality is also a good bit faster than PyTorch’s (at least anecdotally).
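
      (A rough, hypothetical sketch of what I mean, assuming the library we’re discussing is JAX; the toy log-density and the array sizes are made up purely for illustration:)

      ```python
      # Hypothetical sketch: the same array code runs on CPU or GPU,
      # depending only on which JAX backend is installed.
      import jax
      import jax.numpy as jnp

      @jax.jit
      def log_density(theta, data):
          # toy Gaussian log-density, a stand-in for a real MCMC target
          return -0.5 * jnp.sum((data - theta) ** 2)

      key = jax.random.PRNGKey(0)
      data = jax.random.normal(key, (10_000,))
      print(jax.devices())            # lists CPU or GPU devices; no code change needed
      print(log_density(1.0, data))   # compiled on first call, cached afterwards
      ```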

      My experience with it is not at all related to ODEs (more things like MCMC), and I have no direct experience with its gradient functionality and only limited experience with its auto-vectorization, so take my experience with a grain of salt.

      @maegul @astrojuanlu @programming