102 points by mcyc 7 days ago | 9 comments
  • OscarCunningham 4 days ago
    We also know the optimal circuits if you want to compute two boolean functions from four variables at the same time: https://cp4space.hatsya.com/2020/06/30/4-input-2-output-bool....
  • AaronFriel 4 days ago
    Surprised not to see Karnaugh maps mentioned here, as a tool for humans to intuitively find these simplifications.

    https://en.m.wikipedia.org/wiki/Karnaugh_map
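
    K-maps target two-level sum-of-products minimization, which is a slightly different measure than the formula size in the article, but the programmatic counterpart is easy to try. A minimal sketch, assuming SymPy is installed (its SOPform does the Quine–McCluskey minimization that a K-map does by hand):

      from sympy import symbols
      from sympy.logic import SOPform

      x, y, z = symbols('x y z')
      # Minterms where three-input odd parity is true: 001, 010, 100, 111.
      minterms = [[0, 0, 1], [0, 1, 0], [1, 0, 0], [1, 1, 1]]
      print(SOPform([x, y, z], minterms))
      # Parity has no adjacent 1s on the map, so nothing simplifies:
      # the result is the full sum of the four minterms.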

  • Sharlin 4 days ago
    The standard Floyd–Warshall is fairly easily parallelizable. I wonder how fast you could solve this problem with today's GPUs, and whether a(6) might be attainable in some reasonable time.
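
    For a sense of where the parallelism comes from: each iteration of Floyd–Warshall's outer k loop is an O(n^2) relaxation with no internal data dependencies, which is exactly the part that maps onto a GPU. A minimal sketch, assuming NumPy (the vectorized update below is what a GPU kernel would replace):

      import numpy as np

      def floyd_warshall(dist):
          # dist: n x n matrix of edge weights, np.inf where there is
          # no edge, 0 on the diagonal.
          d = dist.copy()
          for k in range(d.shape[0]):
              # For a fixed k, every (i, j) update is independent,
              # so this whole step can run as one parallel kernel.
              d = np.minimum(d, d[:, [k]] + d[[k], :])
          return d
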
  • dooglius 4 days ago
    Could one do this directly with transistors or standard cells? Seems very useful for ASICs, particularly structured ASICs which are mapped from FPGA lookup tables of size 4-6.
    • o11c 4 days ago
      This isn't quite as useful in practice as it seems: NOT isn't always free, common subexpressions can almost always be shared, and gates with more than two inputs are often cheaper than building everything from two-input gates.
  • lilyball 4 days ago
    The example parity function for 3 variables appears to be flipped. Instead of being true if the number of true inputs is odd, it's true if the number of true inputs is even.
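
    For reference, odd parity of three inputs is just the XOR of the inputs; a quick exhaustive check (a Python sketch, not the article's code, with parity3 as an illustrative name):

      from itertools import product

      def parity3(x, y, z):
          # True when an odd number of inputs are true.
          return x ^ y ^ z

      for x, y, z in product([False, True], repeat=3):
          assert parity3(x, y, z) == (sum([x, y, z]) % 2 == 1)
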
  • fallat 3 days ago
    How is Russ so f'ing cracked. The brain on this human. 99.9% of us will never touch his intelligence.
  • senderista 4 days ago
    (From 2011)
  • cluckindan 4 days ago
    Using the * operator for AND is very non-standard. Unicode provides ¬ for negation, ∧ for conjunction and ∨ for disjunction. These are commonly used in CS literature, along with bar(s) over variables or expressions to denote negation, which are definitely a mixed bag for readability.
    • _kb 3 days ago
      From what I've had exposure to, conjunction, disjunction, and negation symbols are common if you're discussing logic [1].

      Boolean algebra then uses product, sum, and complement [2].

      Both can express the same thing. In this case `*` is easier to type than `·`.

      [1]: https://simple.industries/notes/propositions.html

      [2]: https://simple.industries/notes/boolean-algebra.html

    • dse1982 4 days ago
      Isn't the AND operation often represented using multiplication notation (dot or star) because it is basically a boolean multiplication?
      • WorldMaker 4 days ago
        It's not so much that it is "boolean multiplication" (how would you define that? and the digital representation of booleans means integer multiplication still applies) as that AND follows similar laws to multiplication; in particular, AND distributes over OR in the same way multiplication distributes over addition. [Example: a * (b + c) <=> a * b + a * c] Because it follows similar rules, writing boolean expressions with the familiar operators helps with intuition about patterns. (A quick exhaustive check of the distributive law is sketched at the end of this comment.)

        It's somewhat common in older set notation to use * and + for set intersection and set union for very similar reasons. Some programming languages even use that in their type language (a sum type A + B is a tagged union, and a product type A * B is a pair).

        Interestingly, this is why Category Theory in part exists to describe the similarities between operators in mathematics such as how * and ∧ contrast/are similar. Category Theory gets a bad rap for being the origin of monads and fun phrases like "monads are a monoid in the category of endofunctors", but it also answers a few fun questions like why are * and ∧ so similar? (They are similar functions that operate in different "categories".) Admittedly that's a very rough, lay gloss on it, but it's still an interesting perspective on what people talk about when they talk about Category Theory.
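
        A quick exhaustive check of the distributive-law analogy above, as a Python sketch (`and`/`or` standing in for AND/OR):

          from itertools import product

          # a AND (b OR c) == (a AND b) OR (a AND c) for all eight inputs,
          # mirroring a * (b + c) == a * b + a * c.
          for a, b, c in product([False, True], repeat=3):
              assert (a and (b or c)) == ((a and b) or (a and c))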

        • wasabi991011 3 days ago
          Do you really need to introduce category theory for that?

          Seems like overkill; abstract algebra is sufficient to categorize both boolean logic and integer operations as having the common structure of a ring.

          • WorldMaker 3 days ago
            Of course you don't "need" to introduce category theory for that, which is why I saved it for fun at the end. I just think it is neat. It's also one of those bridges to "category theory is simpler than it sounds", which is also why I disagree with calling it "overkill" in general: that keeps category theory in the "too complex for real needs" box, which I think is the wrong box. Case in point:

            > […] abstract algebra seems sufficient to categorize both boolean logic and integer operations as having the common structure of a ring.

            I don't think Ring Theory is any easier than Category Theory to learn or teach; rather, I think Category Theory is a subset of some of the best parts of abstract algebra, especially Group Theory, boiled down to the parts sufficient to describe (among other things) practical function-composition tools for computing.

        • AlotOfReading 3 days ago
          I would normally interpret "Boolean multiplication" as multiplication over GF(2), where + would be XOR. This notation is fairly common when discussing things like cryptography or CRCs.
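
          In that reading, the polynomial ("carry-less") arithmetic behind CRCs adds coefficients with XOR and multiplies them with AND. A minimal sketch (clmul is just an illustrative name, not a library function):

            def clmul(a, b):
                # Multiply two GF(2) polynomials given as bitmasks:
                # partial products are added with XOR, so there are no carries.
                result = 0
                while b:
                    if b & 1:
                        result ^= a
                    a <<= 1
                    b >>= 1
                return result

            # (x + 1) * (x + 1) = x^2 + 1 over GF(2): the 2x term cancels.
            assert clmul(0b11, 0b11) == 0b101
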
        • dse1982 4 days ago
          Thx for your thorough explanation! I don’t know much about these things, just thought about similarities in the algebraic properties, especially with regards to the zero-element: 0*1=0.
        • yorwba 3 days ago
          > digital representation of booleans implies that integer multiplication still applies

          Yes. Multiplication of unsigned 1-bit integers is the same function as boolean AND.

        • Ar-Curunir 3 days ago
          Boolean multiplication is well-defined: it is multiplication mod 2, which is exactly the AND operator.
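
          A tiny exhaustive check of that claim, as a Python sketch:

            for a in (0, 1):
                for b in (0, 1):
                    assert (a * b) % 2 == (a & b)
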
    • bee_rider 4 days ago
      It is not so uncommon to see it represented by a dot. I guess a star is like a dot, but doesn’t require finding any weird keys. It isn’t ideal but it is obvious enough what they mean.