Monocle: Optics Library for Scala

(optics.dev)

103 points | by curling_grad 4 days ago

63 comments

  • openplatypus a day ago

    Yes, I've been using Monocle for years now. So happy to have a rich ecosystem of libraries in Scala land.

  • solid_fuel a day ago

    I haven't encountered this pattern before. Is there some more information on what problems this is designed to solve?

    • dkarl a day ago

      The problem most programmers would be familiar with is making an update inside a deeply nested immutable data structure. For example, suppose you want to update a user's billing address, and you have an immutable data structure that looks like this:

          user { billingInfo: { card, address }, name, subscription: { level, expiration }, status: { level, since } }
      
      The structure is immutable, so you can't make the update in place. On the other hand, you're only changing one field, so it would be wasteful to make a complete deep copy. The efficient way to create an updated instance is to create a new user instance and a new billingInfo instance while reusing the name, subscription, and status instances.

      You can think of this as the equivalent of a setter for immutable data structures.

      This is an artificial example, because the cost of making a deep copy of this user structure is probably not that bad, and the boilerplate to make an efficient update is not all that bad, either. You would use an optics library when 1) you need efficient updates and 2) it's worth investing a little effort to hide the boilerplate.

      Optics also let you concisely express access into a deeply nested structure, the getter paired with the setter. In my experience, updates are the motivation for setting up optics, and concise access is a nice thing you get as a bonus.
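
      To make that concrete, here is roughly how it looks with Monocle 3's `focus` syntax, compared against the hand-written copy boilerplate. The case classes here are hypothetical, just mirroring the structure above:

          import monocle.syntax.all._

          case class Address(street: String, city: String)
          case class BillingInfo(card: String, address: Address)
          case class User(billingInfo: BillingInfo, name: String)

          val user       = User(BillingInfo("4111-xxxx", Address("1 Main St", "Springfield")), "Anna")
          val newAddress = Address("2 Oak Ave", "Springfield")

          // By hand: nested copies, reusing every part that didn't change
          val updated1 = user.copy(billingInfo = user.billingInfo.copy(address = newAddress))

          // With Monocle: the optic expresses the path, the copies happen behind the scenes
          val updated2 = user.focus(_.billingInfo.address).replace(newAddress)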

      • afandian 11 hours ago

        Is this much different from Clojure's `update-in`? You just express a path as a sequence of keys/indexes, plus a function to apply, and let the persistent data structure get updated.

        It's very neat. But it depends on a type system built for this kind of use.

        https://clojuredocs.org/clojure.core/update-in

      • usrusr 20 hours ago

        Does it offer creating a mutable view on top of the immutable structure that can be used to accumulate a whole set of changes to later be materialized into a copy with those changes? (Or to be used directly, at whatever cost that would require.) I've been wondering why that isn't more of an established thing. It would basically be Docker, but for in-memory reference graphs instead of filesystems.

    • Nullabillity a day ago

      If monads are programmable semicolons (ways to chain operations), lenses are programmable dots (ways to delegate access to data). Other optics are largely generalizations of that pattern.

    • AlotOfReading a day ago

      Lenses are the functional version of getters and setters. A lens takes a product type (struct, class, etc.) and allows you to view or update part of it. Prisms are something similar for sum types (variants): they allow you to look at a value if it's present and err otherwise.

      The optical analogy comes from how these operations resemble zooming in on structures with a magnifying glass and the entire family of related transformations is called optics.
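
      For concreteness, a minimal Monocle sketch of both, with made-up types (a `Lens` focuses on a field of a product type, a `Prism` on one case of a sum type):

          import monocle.{Lens, Prism}
          import monocle.macros.GenLens

          // Product type: a Lens views/updates one field
          case class Point(x: Int, y: Int)
          val xL: Lens[Point, Int] = GenLens[Point](_.x)
          xL.get(Point(1, 2))            // 1
          xL.modify(_ + 10)(Point(1, 2)) // Point(11, 2)

          // Sum type: a Prism targets one case, which may or may not be there
          sealed trait Shape
          case class Circle(radius: Double) extends Shape
          case class Square(side: Double)   extends Shape

          val circle: Prism[Shape, Circle] =
            Prism.partial[Shape, Circle] { case c: Circle => c }(c => c)
          circle.getOption(Circle(1.0)) // Some(Circle(1.0))
          circle.getOption(Square(2.0)) // None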

    • kqr 13 hours ago

      Aside from making it convenient to build getters and setters for nested data, optics also have conditional accessors and collection accessors as building blocks.

      Conditional accessors have since become a popular language feature, e.g. the null-conditional operator ?. in C#. Optics make it possible to innovate on features like that in library code rather than as language features.

      Something I've yet to see made into a language feature, but which is common in optics libraries, is iterating accessors. E.g. to reset all counters we can, with optics, say something like (in Haskell syntax, since that is what I know)

          stats.each.count .= 0
      
      and that sets the count to zero for all objects in the stats array. If not all stats objects have a count (some might be of a gauge type, for example) we can compose in a conditional accessor that resolves the count field only if it is not null:

          stats.each.count._Just .= 0
      
      In the above statement, nothing is language syntax: it's all library functions and operators. But it still combines really well on the page. Once one knows optics, one rarely has to think very hard about how to do any get or update operation, even when the data types become complicated.
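
      For comparison, the Monocle 3 spelling of roughly the same thing, with a made-up `Stats`/`Stat` model (`each` is the iterating accessor and `some` plays the role of `_Just`):

          import monocle.syntax.all._

          case class Stat(count: Option[Int])
          case class Stats(all: List[Stat])

          val stats = Stats(List(Stat(Some(3)), Stat(None), Stat(Some(7))))

          // ~ stats.each.count._Just .= 0
          val reset = stats.focus(_.all.each.count.some).replace(0)
          // Stats(List(Stat(Some(0)), Stat(None), Stat(Some(0))))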

    • jeremyjh a day ago

      Lenses make it easier to read and update members deep in a hierarchy of read-only data structures.

    • 62951413 3 hours ago

      There's one book that discusses a few related questions from a very mainstream perspective: Functional and Reactive Domain Modeling (https://www.manning.com/books/functional-and-reactive-domain...). The emphasis here is on the practical, while most similar resources tend to go too hard in the Haskell-on-the-JVM direction.

  • btreecat 8 hours ago

    I'm still trying to figure out what the value proposition of Scala is in a business sense.

    Developers are more expensive and harder to find, the tooling is weaker, the ecosystem less deep, performance suspect, and the overall XP feels clunky.

    Plus our hardware is procedural, and contains/manages state within the instruction pipeline.

    I strongly believe it doesn't belong as a core business technology.

    • halfmatthalfcat 44 minutes ago

      In the right hands, the terseness and expansive nature of the language (big stdlib, Java interop, implicits/givens, higher-kinded types, immutability first, etc.) can greatly increase productivity; however, it takes a higher caliber of developer to wield it effectively. Not to mention the rich ecosystem on the functional side of the house with ZIO/Typelevel, or distributed frameworks like Akka.

      Poor Scala devs output complete junk, but high performers produce incredibly elegant and concise code.

    • noelwelsh 7 hours ago

      I don't think this comment reflects reality. A few issues:

      * Complaints about tooling and ecosystem were valid maybe 10 years ago, but not now.

      * You don't say what your point of comparison is. (weaker than what?)

      * Performance has never been an issue for general programming tasks.

      * You disregard the value that the language brings.

      * Running functional programs on a CPU has not been an issue since, roughly, the 1970s when the basics of compiling FP languages were worked out.

      • btreecat 7 hours ago

        None of that answers the question of what the value proposition is, while also missing the point of several criticisms.

        • noelwelsh 6 hours ago

          The Scala value prop is a modern programming language (= safer, more correct code, which implies faster development) + access to the JVM and JS ecosystems (and WASM and Native in more nascent states).

          Given your comment about stateful CPUs I imagine you don't think programming languages are particularly important. Not everyone does, but opinions do differ on that.

          > also missing the point of several criticism

          I can't correct my misunderstandings if you don't state what they are.

    • lanthissa 5 hours ago

      So the starting point is that you want the JVM/Java interop; if you don't want the JVM, find something else.

      From there you have Java, Scala, and Kotlin as primary choices.

      Up until 2020 Java didn't have pattern matching or records, and Kotlin still doesn't have a full implementation.

      Scala offers an ecosystem where it's very easy to build a robust type system, supporting clear co- and contravariance and clear control flow due to pattern matching.

      On top of that, Scala's compiler plugin ecosystem is quite good, and it allows large organizations to abstract away a lot of busy work to dedicated internal teams. This is used extensively in the large Scala codebases I have seen.

      Scala is a fantastic language for core business technology; in fact, that's probably the single thing it's best at. It's why the largest codebases I know of are in financial institutions.

      DX/tooling is a valid point, but the large Scala codebases that exist within orgs have their own tooling that solves this; at least old Twitter and the financial institutions do.

      So, to answer your question of why someone would choose Scala in a business sense: any JVM project built between roughly 2009 and 2019 had a compelling reason to choose Scala over the other options, and it maintains those advantages, though other languages have since implemented most of the key features, so the gap is much smaller.

    • hocuspocus 4 hours ago

      The small pool of experienced developers is actually awesome. I waste a lot less time screening candidates and they're usually above average.

      Moreover, despite what people claim, I've had no issues mentoring juniors or people coming from other paradigms who are willing to learn.

      Tooling isn't as good as plain Java's, but it's good enough, and certainly better than that of more niche FP languages.

      The ecosystem is pretty rich in spaces where Scala really shines, in some cases it's even better than libraries and frameworks you can find in Java or Kotlin. Plus you can always consume any Java library.

      Performance is good enough for Twitter or big data workloads at major tech companies (many of them use Spark). And nowadays with GraalVM native images you can even reduce the memory footprint to very reasonable levels, not very far from Go.

      • btreecat 3 hours ago

        You have a lot fewer candidates to pull from too, which is often a more limiting factor for a business.

        Businesses don't need everyone to be a rockstar who's also an FP whiz. But FP languages need that.

        • hocuspocus 25 minutes ago

          I've been leading hiring efforts for my team for a while now and Scala has never been the bottleneck. Especially now that the Scala market has shrunk a bit.

          It's true you need some level of expertise and seniority, but that doesn't mean you need to hire only FP wizards, far from it. I believe most modern hybrid languages can be dangerous without supervision.

        • GregarianChild 3 hours ago

          For what it's worth, Scala is a multi-paradigm language and can be used as a pure OO language. This is a great way to start for programmers with an OO background.

          • hocuspocus 22 minutes ago

            I'm not sure it's a good idea though. Scala failed (mainly to Kotlin) as a "better Java". If you aren't committed to Scala's unique features, I would avoid it. That doesn't mean you need to go all in, though; Scala can be simple, simpler than Java in fact.

    • wbl 3 hours ago

      Hardware is actually functional. It will do whatever is needed to get the results you actually want while secretly rearranging everything. And I mean everything: loads and stores don't execute in anything resembling the order you wrote them in. State? An illusion that doesn't even exist the majority of the execution time and has to be expensively reconstructed when things go wrong.

      Circuits are deeply functional: everything is computed all the time and only selected from.

    • fifilura 6 hours ago

      Apache Flink has deprecated support for Scala.

      https://cwiki.apache.org/confluence/display/FLINK/FLIP-265+D...

      An aggregation engine like that was otherwise a good fit for a functional-ish language like Scala.

      That said, it is a nice language: native support for simple functional constructs (map/flatMap, Option, Try) without having to go all-in on functional style.

    • ndriscoll 6 hours ago

      Performance is in the same space as Java or Go, but it's a much higher-level language, so it's easier to write, easier to read, and easier to review. The language prevents many classes of errors without cluttering the written logic, so it's easier to understand what the code is doing in terms of the business domain. The JVM ecosystem is also massive.

      If developers are more expensive, presumably there's a reason they're able to ask those rates besides "they know scala".

      • btreecat 4 hours ago

        > Performance is in the same space as java or go, but it's a much higher level language, so easier to write, easier to read, and easier to review.

        What metrics are you using to make the claim that it's "easier" than more widely known, better-tooled, and more easily searchable languages?

        >The language prevents many classes of errors without cluttering the written logic, so it's easier to understand what it's doing in terms of business domain.

        I've not seen this play out on a codebase of nontrivial size. I find it forces new ways of doing the same old stuff because "functional".

        > If developers are more expensive, presumably there's a reason they're able to ask those rates besides "they know scala".

        Why would you presume that it's not a simple supply and demand?

        The fact is there are fewer FP devs. Why is that, if FP is "better"?

        There are fewer Scala devs because Scala isn't better.

        • GregarianChild 3 hours ago

          > if FP is "better"

          The term FP has lost precise meaning over time, and split into several related meanings.

          • Focus on the absence of side effects. (E.g. Agda, but not Scala, OCaml, or F#.)

          • Focus on higher-order functions. (This is no longer controversial, but it used to be.)

          • Focus on rich types. (The tradition of ML, the programming language; note that Lisp and even the lambda calculus were born untyped.)

          • Focus on the use of traditional functional idioms, e.g. map-reduce, even when the implementation is not functional at all.

          Which one are you referring to?

        • ndriscoll 3 hours ago

          My claim is backed by years of direct experience with C, C++, Java, Go, Scala, PHP, and Python. I can tell you that Scala has been the easiest to write, read, and review by far. I'm not sure why you'd expect "easier to read and review" to be metrizable. Nor do I see the connection between those things and wide use, tooling, or searchability. If you need to search something to understand what you're reading, I suppose that itself is a measure of that code being difficult to understand (i.e. it does not stand on its own).

          > I've not seen this play out on a codebase of nontrivial size.

          I have :shrug:. There's currently this discussion[0] about some functional design principles on the front page. Things like making illegal states unrepresentable mean that you just don't have to think at all about various would-be error paths (again, making it easier to write, easier to read, and easier to review), and while Scala doesn't automatically give you such a design, it at least makes it possible (unlike e.g. Go). You just need a good lead to guide people's design.

          In fact "make illegal states unrepresentable" is perhaps just one facet of a more general observation that if you accurately model your domain, the code basically writes itself. IME Scala's type system hits a sweet spot of giving you the ability to model what you need without forcing you to model concerns you don't want to care about (e.g. Rust's lifetimes).

          Demand for what? Why would there be intrinsic demand for Scala programmers? Why isn't there a similar premium for Perl or COBOL programmers? It's not like Javascript or C/C++/Rust where you have limited options of what languages can be used in some domain (browsers, embedded, etc.). It's a high level application language, which is a space with tons of other languages.

          One possible contribution is that it has features that more experienced engineers can appreciate: a compiler that helps prevent basic errors, a good concurrency story, and workable macros, so that you can focus on teaching your team domain context instead of hunting for errors with nil references or locks, or poring over tons of boilerplate. Performance is also good enough that you probably don't need to worry about e.g. horizontal scaling. So you may tend to see those engineers wanting to use it more.

          e.g. I did a few years of PHP (Laravel) work. You won't ever find me taking a PHP job again; I've had enough of it for one lifetime and don't need that kind of stress. My understanding is Ruby/Rails are similar, so again I just wouldn't consider a Rails job. I'd probably prefer working with Scala, C++, or maybe Rust, so you're more likely to find me in a "Scala programmer" statistic and won't find me in a "PHP programmer" statistic, but that's not really why I cost more. I expect more money than your average developer because I have more experience than your average developer and I've been in tech lead roles for a while now.

          [0] https://news.ycombinator.com/item?id=42244851

    • 62951413 3 hours ago

      You're unfortunately right. Every Scala codebase I have seen recently is essentially legacy. New services are not built with it anymore. Other than blaming SBT and the Haskell fans, I'm not aware of a technical justification. But the industry/community has shifted since 2015-2018.

      The only rare exceptions I've heard of are strict FP shops that share nothing with common JVM-based development. You could have a few years of full-time Scala experience with Akka and Spark and they won't even screen you.

      In the year of our Lord 2024, the question is whether you want to bet on Java 21+ catching up with Kotlin or go with Kotlin from day one. The choice is less obvious nowadays. I'd go with the more modern language, but the actual job market still sides with Java, at least on the backend in the Bay Area.

      As a side note, I cannot imagine a competent JVM-based developer not familiar with either Kotlin or Scala by now. In a typical Java shop half of the team is dying to switch to one of them in my experience.

    • dionian 6 hours ago

      It's a more powerful version of Java. Java is still trying to backport features from Scala.

  • mhitza 21 hours ago

    Does the LSP provide clear autocomplete on what properties can be accessed on the bound _ ?

    Asking as someone who doesn't use Scala at all but has seen the hit-and-miss quality of some FP language LSPs.

    • valenterry 18 hours ago

      Yes, it does; otherwise the code would not actually compile.

    • lmm 18 hours ago

      IntelliJ or the older Scala-IDE for Eclipse certainly does, so I'd be very disappointed if the LSP impl (which the Scala maintainers have been pushing as the official IDE replacement these days) didn't.

  • wk_end a day ago

    Also available for TypeScript:

    https://gcanti.github.io/monocle-ts/

  • henning a day ago

    So behind the scenes, every one of those statements will make a whole new user object with a whole new address object so that it remains immutable? And whether that will actually have any real-world performance impact is I guess entirely situational. Still, what happens if you do that with a big object graph?

    Also, the original strong need for immutable data in the first place is safety under concurrency and parallelism?

    • sriram_malhar a day ago

      Yes, behind the scenes every one of those statements will make a shallow copy of the object. But it isn't necessarily just that object. For example, if you modify a tree node, then not only does that node need cloning, its parent does too (since the modified parent needs to point to the new node), and so on up to the root, which results in h = O(log n) new objects to create an entirely new tree (h being the height of the tree).
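
      A small sketch of that path copying on a hypothetical immutable binary search tree; only the nodes along the path to the change are reallocated, everything else is shared:

          sealed trait Tree
          case object Leaf extends Tree
          case class Node(key: Int, left: Tree, right: Tree) extends Tree

          def insert(t: Tree, k: Int): Tree = t match {
            case Leaf => Node(k, Leaf, Leaf)
            case n @ Node(key, l, r) =>
              if (k < key)      n.copy(left  = insert(l, k)) // new node; right subtree reused as-is
              else if (k > key) n.copy(right = insert(r, k)) // new node; left subtree reused as-is
              else              n                            // key already present: reuse whole tree
          }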

      What you get out of it is (a) safety and (b) understandability, which are wonderful properties to have as long as the end result performs adequately. Implementing concurrent tree or graph traversals under conventional mutation is painful; the Java collection libraries simply throw a ConcurrentModificationException. The equivalent code for read-only traversals of immutable data structures is simplicity itself. You also get versioning and undo for free.

    • Nullabillity a day ago

      > Still, what happens if you do that with a big object graph?

      The only thing that really matters here is how deep the graph is. Any unchanged object can just be reused as-is.

    • kelnos a day ago

      This is in general how "mutations" are supposed to be done in a language like Scala (and is not unique to this library). Yes, Scala does have a set of mutable collections, but the immutable collections are heavily optimized to make creating a "new" collection with a mutation much cheaper than having to copy the entire collection.

      Of course, copying a case class in order to change a field does require allocating a new instance of that class, though since this is the JVM, the fields are references, so things like strings can be shared between the copies.

      Ultimately this pattern is... fine. Most uses don't end up caring about the extra overhead vs. that of direct mutation. I don't recall if the Scala compiler does this, but another optimization that can be used is to actually mutate an immutable object when the compiler knows the original copy isn't used anywhere else after the mutation.
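
      A quick illustration of that sharing, with made-up types; the copy is shallow, so unchanged fields point at the very same objects:

          case class Subscription(level: String, expiration: Long)
          case class User(name: String, subscription: Subscription)

          val before = User("Anna", Subscription("pro", 1735689600L))
          val after  = before.copy(name = "Anna B.")

          after.subscription eq before.subscription // true: the Subscription instance is reused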

      > Also, the original strong need for immutable data in the first place is safety under concurrency and parallelism?

      That's one of the uses, but multiple ownership in general is another, without the presence of concurrency.

      On top of that, there's the general belief (which I subscribe to) that mutation introduces higher cognitive load on someone understanding the code. Immutable data is much easier to reason about.

    • lmm 18 hours ago

      > So behind the scenes, every one of those statements will make a whole new user object with a whole new address object so that it remains immutable?

      Not a "whole new" one since it will use shared references to the parts that didn't change (which is valid since they're immutable). And in principle the VM could even recognise that the new object is replacing the old one so it can be edited in place.

      > Still, what happens if you do that with a big object graph?

      I've literally never seen it cause a real-world performance problem, even if it theoretically could.

      > Also, the original strong need for immutable data in the first place is safety under concurrency and parallelism?

      Partly that, but honestly mostly development sanity and maintainability. You can iterate a lot faster on immutable-first codebases, because it takes much less test coverage etc. to have the same level of confidence in your code.

    • valenterry 18 hours ago

      Your question is a bit like someone asking "so what does the garbage collector actually do? Does it X or Y? What impact does it have?"

      And the answer is: there's no need to care about it, unless you really need to optimize for high performance (not necessary in 99% of cases; otherwise you'd have picked a different language from the beginning anyway).

      > Also, the original strong need for immutable data in the first place is safety under concurrency and parallelism?

      One of the reasons is that you really can just completely stop thinking about it, just like you can stop thinking about (de)allocations, except for some edge cases where performance matters a lot.

    • threeseed a day ago

      > actually have any real-world performance impact

      There are many techniques like this within Scala that would never be feasible if it wasn't for the fact that the JVM is ridiculously fast. You could write the worst code imaginable and in many cases would still have better performance than Python, Javascript etc.

  • ldjkfkdsjnv a day ago

    Every Scala codebase I have worked on that wasn't written by a small team of experts turned into a huge pile of crap. A small squad of people who treat the language like a religion create an impenetrable masterpiece.

    • threeseed a day ago

      A lot of work has been done in Scala 3 to simplify everything.

      And with the arrival of virtual threads in the JVM there are new concurrency libraries, e.g. Ox [1] and Gears [2], that remove the need for FP concepts like monads, which have been the major source of much of the complexity.

      For all its problems it is a seriously underrated platform, especially Scala.js, which IMHO is far better and simpler than TypeScript.

      [1] https://github.com/softwaremill/ox

      [2] https://github.com/lampepfl/gears

    • Sunscratch a day ago

      Every <insert any language here> codebase I have worked on that wasn't written by a small team of experts turned into a huge pile of crap…

    • wiml 13 hours ago

      You're going to have that problem with any codebase written by people who don't particularly know the language. Typescript written by PHP programmers, Python written by Java programmers, you'll quickly get a huge impenetrable pile of crap.

      You can optimize your codebase to be modified by an ever rotating group of people who don't fully understand it, or by a smaller group of people who do. Both are legitimate choices in specific contexts. But if you take a codebase written one way and try to maintain it the other way, your productivity will tank.

    • agent281 5 hours ago

      I feel the same about Javascript codebases. As I like to say, hell is other people's Javascript.

    • wtfparanoid a day ago

      Well-aligned Scala teams are a great thing; impenetrable code is not. Maybe a poor choice of adjective?

  • cultofmetatron 11 hours ago

    am I the only one who isn't a little disappointed that this wasn't a library to model the physics of physical lenses?

    • itronitron 8 hours ago

      By "isn't a little disappointed" do you mean very disappointed or not disappointed at all? Although that may not be an important distinction as I think the two groups are roughly the same size.

      • cultofmetatron an hour ago

        Whoops, I meant to say "is a little disappointed".

  • itronitron a day ago

    This has nothing to do with optics, which is a branch of physics that studies the behavior and properties of light.

    • solomonb 21 hours ago

      Words can have many uses.

      https://en.wikipedia.org/wiki/Optic_(disambiguation)

      > In computer science, optics are a general class of bidirectional transformations

    • dkarl 17 hours ago

      Virtually everything in computer science is a metaphor. A computer was a human being before it was a machine. A block of memory, an array of values, an index into a structure, most of the vocabulary we use every day is built out of metaphors.

    • signaru 20 hours ago

      The name, Monocle, also further misleads those expecting the physics topic. They actually have a nice logo with a lens and the lambda symbol, which is often used for wavelength.

    • evertedsphere 19 hours ago

      "this has nothing to do with classes, which in sociology and related fields are strata which society can be analysed as being divided into"

    • fn-mote 21 hours ago

      This is why I read the comments before I click on the link. :)

    • Xophmeister a day ago

      It’s a metaphor.

    • dpratt 17 hours ago

      Imagine my disappointment when I spent the time to set up a Cassandra instance and it did not immediately materialize a demigod woman who knew the answers to everything but was cursed to have no one believe her.