141 points by hyperbrainer 16 hours ago | 8 comments
  • abathologist 15 hours ago
    Anyone know how Curry (which has a Haskell-like syntax extended to support prologish features) compares with Mercury (which has a Prolog-like syntax extended to support Haskellish features)?
    • sterlind 14 hours ago
      Mercury feels like if the Ada people wrote Prolog. it's very verbose. you have to declare signatures in separate files and determinism modes. grounding is strictly enforced. it's statically typed. there's no REPL, remarkably.

      in exchange, the compiler catches a lot of bugs and the code is blazing fast.

      Curry is a superset of Haskell. it takes Haskell's pattern matching and makes it extremely general (full unification), extends it to non-determinism with choice points. it does have a REPL, like ghci.
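
      for example, (?) is the choice operator — a minimal sketch (from memory, so details may vary):

        -- coin non-deterministically evaluates to 0 or 1;
        -- asking the REPL for all values of coin enumerates both.
        coin :: Int
        coin = 0 ? 1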

      Like Haskell, Curry is lazy. Mercury (like Prolog) uses mostly eager, depth-first evaluation (SLDNF resolution.) Clause order doesn't matter in Curry, which uses a strategy of "needed narrowing" - variables are narrowed when they need to be.

      Unlike Mercury (and Prolog), and like Haskell and other FP languages, Curry draws a distinction between function inputs and outputs. You can do relational programming via guards and pattern matching, but it doesn't feel as Prolog-y.
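
      e.g. the classic Curry definition of last, written relationally with free variables (quoting from memory, so details may vary):

        -- find x (and ys) such that ys ++ [x] unifies (=:=) with xs
        last xs | ys ++ [x] =:= xs = x
          where ys, x free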

      Curry is more niche than Mercury, which is at least being used to build Souffle (a static analysis language built on Datalog), which is actually being used in industry somewhat. But it's a shame because Curry has a lot to offer, especially to Haskellers. They're both worth checking out though.

    • hyperbrainer 14 hours ago
      Not a technical difference, but I think Mercury is somewhat more "commercial" in that its development is finished and it can be used in real projects, compared to Curry, which is very much still in development.
  • badmonster 15 hours ago
    How does Curry manage ambiguity in non-deterministic computations—especially when multiple valid instantiations exist for a free variable?
    • pjmlp 15 hours ago
      Probably like Prolog, we get to generate all possible variations.
  • taeric 10 hours ago
    That example for "some permutation" is not at all easy for me to understand. I'm assuming I'm just not familiar with the general style?
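
    For reference, the example in question looks roughly like this (reproduced from memory, so details may differ):

      -- both rules of insert can match a non-empty list,
      -- so each call is a choice point
      insert x ys     = x : ys
      insert x (y:ys) = y : insert x ys

      perm []     = []
      perm (x:xs) = insert x (perm xs)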
    • idle_zealot 10 hours ago
      I'm unfamiliar as well, but my best guess is that it relies on non-determinism. i.e. both definitions of 'insert' might be valid, and the runtime chooses which to use at random, resulting in either x or y being prepended to the returned list.
      • sterlind 9 hours ago
        it's not random. it tries definitions in declaration order until one succeeds. it's then yielded as an assignment of variables and control returns to the caller. if that assignment gets contradicted it will backtrack and try the second definition, so on and so forth. it's more like coroutining.
        • taeric 9 hours ago
          Does this definition somehow cause all random permutations of a given list? The definition of "Some Permutation" seems to imply it can be used any place you need to try any/all permutations, one at a time? At the least, repeated calls to this would be different permutations?
          • sterlind 6 hours ago
            quick Prolog example because I'm not as familiar with Curry:

              % This generates or recognizes any palindrome:
              pal --> [_].
              pal --> [X], pal, [X].

              % Here we try it out and press ; to generate more answers:
              ?- phrase(pal, P).
              P = [A] ;
              P = [B, A, B] ;
              ...

              % Here we plug in a value and it fails with [A], then with
              % [B,A,B], etc., until it gets to [D,C,B,A,B,C,D], which
              % can be unified with "racecar":
              ?- phrase(pal, "racecar").
              true.

            Another example is just (X=a;X=b),(Y=b;Y=a),X=Y. This has two answers: X=a, Y=a, and X=b,Y=b. What happens is that it first tries X=a, then moves onto the second clause and tries Y=b, then moves onto the third clause and fails, because a≠b! So we backtrack to the last choicepoint, and try Y=a, which succeeds. If we tell Prolog we want more answers (by typing ;) we have exhausted both options of Y, so we'll go back to the first clause and try X=b, then start afresh with Y again (Y=b), and we get the second solution.

            Prolog goes in order, and goes deep. This is notoriously problematic, because it's incomplete. Curry only evaluates choicepoints that a function's output depends on, and only when that output is needed. Curry does have disjunctions (using ? rather than Prolog's ;), unification (via =:= rather than =), and pattern guards rather than clause heads, and its evaluation strategy is different because of laziness, but in terms of the fundamentals this is what "non-determinism" means in logic programming: it doesn't mean random, it means decisions are left to the machine to satisfy your constraints.
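
            At an SWI-Prolog top level the walkthrough above plays out like this (output formatting varies between systems):

              ?- (X=a ; X=b), (Y=b ; Y=a), X=Y.
              X = Y, Y = a ;
              X = Y, Y = b.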

  • otherayden 13 hours ago
    Imagine having your first and last names turn into two separate programming languages lol
  • currando 14 hours ago
    The documentation, the Curry report, is good for learning Curry.

    https://curry-lang.org/docs/report/curry-report.pdf

    Interesting, the email at the end of this thread: https://news.ycombinator.com/item?id=12668591

  • pmarreck 15 hours ago
    As is usual with any language that is new to me, I would love a comparison of this language with other languages along a number of commonly-valued dimensions:

    speed, code samples of small algorithms, notable dependencies, features (immutable data, static typing, etc.), and so on.

    • TypingOutBugs 15 hours ago
      Fwiw, Curry is 30 years old! It looks newer than it is from the site
  • johnnyjeans 13 hours ago
    The comparisons they're making don't make sense to me. I don't think I've ever even seen a logic language without nested expressions. Also VERY weird they give non-determinism as a feature of logic programming. Prolog is the only one off the top of my head that allows for it. Even most Prolog derivatives drop the cut and negation operations. In the broader scope of logic languages, most aren't even turing complete, like Datalog or CLIPS.

    I really feel like Prolog and its horn clause syntax are underappreciated. For as much as lispers will rant and rave about macros, how their code is data, it always struck me as naive cope. How can you say that code is data (outside of the obvious von neumann meaning), but still require a special atomic operation to distinguish the two? In Prolog, there is no such thing as a quote. It literally doesn't make sense as a concept. Code is just data. There is no distinguishing between the two, they're fully unified as concepts (pun intended). It's a special facet of Prolog that only makes sense in its exotic execution model that doesn't even have a concept of a "function".
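
    A goal in Prolog is just an ordinary term, so you can build it as data and then run it as code, with no quote in sight (a toy illustration):

      % Goal is a term we construct, then execute with call/1.
      ?- Goal = member(X, [1,2,3]), call(Goal).
      % first answer binds X = 1; press ; for more.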

    For that reason, I tend to have a pessimistic outlook on things like Curry. Static types are nice, and they don't work well with horn clauses (without abusing atoms/terms as a kind of type-system), but it's really not relevant enough to the paradigm that replacing beautiful horn clauses with ISWIM/ML syntax makes sense to me. Quite frankly, I have great disdain even for Elixir, which trades the beautiful Prolog-derived syntax of Erlang for a pseudo-Ruby.

    One thing I really would like to see is further development of the abstract architectures used for logic programming systems. The WAM is cool, but it's absolutely ancient and theory has progressed light-years since it was designed. The interaction calculus, or any graph reduction architecture, promises huge boons for a neo-Prolog system. GHC has incidentally paved the way for a brand new generation of logic programming. Sometimes I feel crazy for being the only one who sees it.

    • YeGoblynQueenne 8 hours ago
      Curry is very recognisably the functional programmer's conception of what logic programming is, which is the way it's described in the SICP book. Nothing to do with Resolution, Horn clauses, or even unification, instead it's all about DFS with backtracking. Sometimes dictionaries (!) have something to do with it [1].

      I'm speaking from personal experience here. DFS with backtracking has always featured very prominently in discussions I've had with functional programming folks about logic programming and Prolog and for a while I didn't understand why. Well it's because they have an extremely simplified, reductive model of logic programming in mind. As a consequence there's a certain tendency to dismiss logic programming as overly simplistic. I remember a guy telling me the simplest exercise in some or other of the classic functional programming books is implementing Prolog in (some kind of) Lisp and it's so simple! I told him the simplest exercise in Prolog is implementing Prolog in Prolog but I don't think he got what I meant because what the hell is a Prolog meta-interpreter anyway [2]?

      I've also noticed that functional programmers are scared of unification - weird pattern matching on both sides, why would anyone ever need that? They're also freaked out by the concept of logic variables and what they call "patterns with holes" like [a,b,C,D,_,E] which are magickal and mysterious, presumably because you have to jump through hoops to do something like that in Lisp. Like you have to jump through hoops to treat your code as data, as you say.

      And of course if you drop Resolution, you drop SLD-Resolution, and if you drop SLD-Resolution you drop the Horn clauses, whose big advantage is that they make SLD-Resolution a piece of cake. Hence the monstrous abomination of "logic programming" languages that look like ... Haskell. Or sometimes like Scheme.

      Beh, rant over. It's late. Go to sleep grandma. yes yes you did it all with Horn clauses in your time yadda yadda...

      ___________

      [1] Like in this MIT lecture by H. Abelson, I believe with G. Sussman looking on:

      https://youtu.be/rCqMiPk1BJE?si=VBOWeS-K62qeWax8

      [2] It's a Prolog interpreter written in Prolog. Like this:

        prove(true):-
          !. %OMG
        prove((Literal,Literals)):-
          prove(Literal)
         ,prove(Literals).
        prove(Literal):-
          Literal \= (_,_)
         ,clause(Literal,Body)
         ,prove(Body).
      
      Doubles as a programmatic definition of SLD-Resolution.
      • johnnyjeans 7 hours ago
        a prolog wizard crossing the path is an exceedingly rare and brilliant event, im compelled to make a wish upon this shooting star :3

        > I remember a guy telling me the simplest exercise in some or other of the classic functional programming books is implementing Prolog in (some kind of) Lisp and it's so simple!

        it's really easy to underestimate just how well engineered prolog's grammar is, because it's so deceptively simple. the only way you're getting simpler is, like, assembly. and it's a turing-equivalent kind of machine, but because you can squint and delude yourself into thinking it kind of looks procedural, people can fool themselves into being satisfied that they "get" it, without actually getting it.

        but the moment NAF and resolution as a concept clicks, it's like you brushed up against the third rail of the universe. it's insane to me we let these paradigms rot in the stuffy archives of history. the results this language pulls with natural language processing should raise any sensible person's alarm bells to maximum volume: something is Very Different here. if lisp comes from another planet, prolog came from an alternate dimension. technological zenith will be reached when we push a prolog machine into an open time-like curve and make our first hypercomputation.

    • spencerflem 4 hours ago
      What's your take on Finite Choice Logic Programming / Dusa btw?

      Been messing with it & Answer Set Programming recently and still trying to work out my own thoughts on it