Logical matrix
A logical matrix, binary matrix, relation matrix, Boolean matrix, or (0,1) matrix is a matrix with entries from the Boolean domain B = {0, 1}. Such a matrix can be used to represent a binary relation between a pair of finite sets.
Matrix representation of a relation
If R is a binary relation between the finite indexed sets X and Y (so R ⊆ X×Y), then R can be represented by the logical matrix M whose row and column indices index the elements of X and Y, respectively, such that the entries of M are defined by

Mi j = 1 if (xi, yj) ∈ R, and Mi j = 0 otherwise.
In order to designate the row and column numbers of the matrix, the sets X and Y are indexed with positive integers: i ranges from 1 to the cardinality (size) of X and j ranges from 1 to the cardinality of Y. See the entry on indexed sets for more detail.
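As a concrete illustration, here is a minimal Python sketch of this construction; the function name relation_to_matrix is chosen for illustration only.

```python
def relation_to_matrix(X, Y, R):
    """Logical matrix of a relation R given as a set of pairs.

    X and Y are indexed (ordered) finite sets, given here as lists;
    the entry M[i][j] is 1 exactly when (X[i], Y[j]) is in R.
    """
    return [[1 if (x, y) in R else 0 for y in Y] for x in X]

# Applied to the divisibility relation of the next section:
X = [1, 2, 3, 4]
R = {(a, b) for a in X for b in X if b % a == 0}
M = relation_to_matrix(X, X, R)
# M == [[1, 1, 1, 1], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]]
```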
Example
The binary relation R on the set {1, 2, 3, 4} is defined so that aRb holds if and only if a divides b evenly, with no remainder. For example, 2R4 holds because 2 divides 4 without leaving a remainder, but 3R4 does not hold because when 3 divides 4 there is a remainder of 1. The following set is the set of pairs for which the relation R holds.
- {(1, 1), (1, 2), (1, 3), (1, 4), (2, 2), (2, 4), (3, 3), (4, 4)}.
The corresponding representation as a logical matrix is

  1 1 1 1
  0 1 0 1
  0 0 1 0
  0 0 0 1

which includes a diagonal of ones since each number divides itself.
Other examples
- A permutation matrix is a (0,1)-matrix each of whose rows and columns has exactly one nonzero entry.
- A Costas array is a special case of a permutation matrix.
- An incidence matrix in combinatorics and finite geometry has ones to indicate incidence between points (or vertices) and lines of a geometry, blocks of a block design, or edges of a graph.
- A design matrix in analysis of variance is a (0,1)-matrix with constant row sums.
- A logical matrix may represent an adjacency matrix in graph theory: non-symmetric matrices correspond to directed graphs, symmetric matrices to ordinary graphs, and a 1 on the diagonal corresponds to a loop at the corresponding vertex.
- The biadjacency matrix of a simple, undirected bipartite graph is a (0,1)-matrix, and any (0,1)-matrix arises in this way.
- The prime factors of a list of m square-free, n-smooth numbers can be described as an m×π(n) (0,1)-matrix, where π is the prime-counting function and the entry ai j is 1 if and only if the jth prime divides the ith number (see the sketch after this list). This representation is useful in the quadratic sieve factoring algorithm.
- A bitmap image containing pixels in only two colors can be represented as a (0,1)-matrix in which the 0's represent pixels of one color and the 1's represent pixels of the other color.
- A binary matrix can be used to check the rules of the game of Go.[1]
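As an illustration of the prime-factor matrix item above, the following sketch (with a hard-coded prime list standing in for the prime-counting function) builds the matrix for a few square-free, 7-smooth numbers:

```python
primes = [2, 3, 5, 7]                  # the pi(7) = 4 primes up to 7
numbers = [1, 6, 10, 15, 21, 35, 105]  # square-free and 7-smooth

# A[i][j] = 1 iff the j-th prime divides the i-th number
A = [[1 if x % p == 0 else 0 for p in primes] for x in numbers]
# e.g. the row for 105 = 3*5*7 is [0, 1, 1, 1]
```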
Some properties
The matrix representation of the equality relation on a finite set is the identity matrix I, that is, the matrix whose entries on the diagonal are all 1, while the others are all 0. More generally, if a relation R satisfies I ⊆ R, then R is a reflexive relation.
If the Boolean domain is viewed as a semiring, where addition corresponds to logical OR and multiplication to logical AND, the matrix representation of the composition of two relations is equal to the matrix product of the matrix representations of these relations. This product can be computed in expected time O(n²).[2]
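A minimal sketch of this OR–AND (Boolean) matrix product follows; this is the naive cubic-time product, not the expected-quadratic-time algorithm cited above.

```python
def boolean_product(A, B):
    """Boolean matrix product over the semiring ({0, 1}, OR, AND).

    If A represents R ⊆ X×Y and B represents S ⊆ Y×Z, the result
    represents the composite relation {(x, z) : xRy and ySz for some y}.
    """
    return [[int(any(A[i][k] and B[k][j] for k in range(len(B))))
             for j in range(len(B[0]))]
            for i in range(len(A))]
```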
Frequently, operations on binary matrices are defined in terms of modular arithmetic mod 2; that is, the elements are treated as elements of the Galois field GF(2) = ℤ₂. They arise in a variety of representations and have a number of more restricted special forms. They are applied, for example, in XOR-satisfiability.
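For contrast with the Boolean product above, here is a sketch of the same product taken over GF(2), where addition is XOR rather than OR:

```python
def gf2_product(A, B):
    """Matrix product over GF(2): AND for multiplication, XOR (sum mod 2) for addition."""
    return [[sum(A[i][k] & B[k][j] for k in range(len(B))) % 2
             for j in range(len(B[0]))]
            for i in range(len(A))]
```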
The number of distinct m-by-n binary matrices is equal to 2^(mn), and is thus finite.
Lattice
Let n and m be given and let U denote the set of all logical m × n matrices. Then U has a partial order given by

A ≤ B when Ai j ≤ Bi j for all i and j; equivalently, A ≤ B when every entry that is 1 in A is also 1 in B.
In fact, U forms a Boolean algebra with the operations AND and OR applied componentwise to pairs of matrices. The complement of a logical matrix is obtained by exchanging all zeros and ones.
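A sketch of the componentwise operations and the partial order, for logical matrices of equal shape; the function names here are chosen for illustration:

```python
def meet(A, B):        # componentwise AND (greatest lower bound)
    return [[a & b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def join(A, B):        # componentwise OR (least upper bound)
    return [[a | b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def complement(A):     # exchange zeros and ones
    return [[1 - a for a in row] for row in A]

def leq(A, B):         # the partial order: every 1 of A is also a 1 of B
    return all(a <= b for ra, rb in zip(A, B) for a, b in zip(ra, rb))
```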
Every logical matrix a = (ai j) has a transpose aT = (aj i). Suppose a is a logical matrix with no columns or rows identically zero. Then, using Boolean arithmetic, the matrix product aT a contains the n × n identity matrix and the product a aT contains the m × m identity.
As a mathematical structure, the Boolean algebra U forms a lattice ordered by inclusion; additionally it is a multiplicative lattice due to matrix multiplication.
Every logical matrix in U corresponds to a binary relation. The operations on U listed above, together with the ordering, correspond to a calculus of relations, in which matrix multiplication represents the composition of relations.[3]
Logical vectors
Group-like structures

| | Totality^α | Associativity | Identity | Invertibility | Commutativity |
|---|---|---|---|---|---|
| Semigroupoid | Unneeded | Required | Unneeded | Unneeded | Unneeded |
| Small category | Unneeded | Required | Required | Unneeded | Unneeded |
| Groupoid | Unneeded | Required | Required | Required | Unneeded |
| Magma | Required | Unneeded | Unneeded | Unneeded | Unneeded |
| Quasigroup | Required | Unneeded | Unneeded | Required | Unneeded |
| Unital magma | Required | Unneeded | Required | Unneeded | Unneeded |
| Loop | Required | Unneeded | Required | Required | Unneeded |
| Semigroup | Required | Required | Unneeded | Unneeded | Unneeded |
| Inverse semigroup | Required | Required | Unneeded | Required | Unneeded |
| Monoid | Required | Required | Required | Unneeded | Unneeded |
| Commutative monoid | Required | Required | Required | Unneeded | Required |
| Group | Required | Required | Required | Required | Unneeded |
| Abelian group | Required | Required | Required | Required | Required |

^α Closure, which is used in many sources, is an equivalent axiom to totality, though defined differently.
If m or n equals one, then the m × n logical matrix (Mi j) is a logical vector. If m = 1 the vector is a row vector, and if n = 1 it is a column vector. In either case the index equaling one is dropped from the notation for the vector.
Suppose P = (P1, …, Pm) and Q = (Q1, …, Qn) are two logical vectors. The outer product of P and Q results in an m × n rectangular relation

mi j = Pi ∧ Qj.

- A re-ordering of the rows and columns of such a matrix can assemble all the ones into a rectangular part of the matrix.[4]
Let h be the vector of all ones. Then if v is an arbitrary logical vector, the relation R = v hT has constant rows determined by v. In the calculus of relations such an R is called a vector.[4] A particular instance is the universal relation h hT.
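A sketch of the outer product of logical vectors and of a constant-row relation v hT; the names are chosen for illustration:

```python
def outer(P, Q):
    """Outer product of logical vectors: the relation with entries P_i AND Q_j."""
    return [[p & q for q in Q] for p in P]

v = [1, 0, 1]
h = [1, 1, 1, 1]      # the all-ones vector
R = outer(v, h)       # rows are constant: [1,1,1,1], [0,0,0,0], [1,1,1,1]
```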
For a given relation R, a maximal rectangular relation contained in R is called a concept in R. Relations may be studied by decomposing them into concepts, and then noting the induced concept lattice.
Consider the table of group-like structures, where "Unneeded" can be denoted by 0 and "Required" by 1, forming a logical matrix R. To calculate the elements of R RT it is necessary to use the logical inner product of pairs of logical vectors in the rows of this matrix. If this inner product is 0, then the rows are orthogonal. In fact, small category is orthogonal to quasigroup, and groupoid is orthogonal to magma. Consequently there are 0's in R RT and it fails to be a universal relation.
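A short sketch checking one of these orthogonality claims, encoding just the two relevant rows of the table:

```python
# Two rows of the group-like structures table, 1 = Required, 0 = Unneeded;
# columns: totality, associativity, identity, invertibility, commutativity
small_category = [0, 1, 1, 0, 0]
quasigroup     = [1, 0, 0, 1, 0]

def inner(u, v):
    """Logical inner product: OR over the componentwise ANDs."""
    return int(any(a & b for a, b in zip(u, v)))

print(inner(small_category, quasigroup))   # 0: the rows are orthogonal
```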
Row and column sums
Adding up all the 1's in a logical matrix may be accomplished in two ways: first summing the rows or first summing the columns. Either way the total is the same: the sum of the row sums equals the sum of the column sums. In incidence geometry, the matrix is interpreted as an incidence matrix with rows corresponding to "points" and columns to "blocks" (generalizing lines made of points). A row sum is then called a point degree and a column sum a block degree. Proposition 1.6 in Design Theory[5] says that the sum of point degrees equals the sum of block degrees.
An early problem in the area was "to find necessary and sufficient conditions for the existence of an incidence structure with given point degrees and block degrees (or in matrix language, for the existence of a (0,1)-matrix of type v × b with given row and column sums)."[5] Such a structure is a block design.
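A short sketch confirming the equality of the two totals for the divisibility matrix of the earlier example:

```python
M = [[1, 1, 1, 1],
     [0, 1, 0, 1],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]

point_degrees = [sum(row) for row in M]        # row sums: [4, 2, 1, 1]
block_degrees = [sum(col) for col in zip(*M)]  # column sums: [1, 2, 2, 3]
assert sum(point_degrees) == sum(block_degrees) == 8   # both count the 1's
```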
See also
- List of matrices
- Binatorix (a binary De Bruijn torus)
- Bit array
- Redheffer matrix
Notes
- Petersen, Kjeld (February 8, 2013). "Binmatrix". Retrieved August 11, 2017.
- Patrick E. O'Neil; Elizabeth J. O'Neil (1973). "A Fast Expected Time Algorithm for Boolean Matrix Multiplication and Transitive Closure". Information and Control. 22 (2): 132–138. doi:10.1016/s0019-9958(73)90228-3. — The algorithm relies on addition being idempotent, cf. p.134 (bottom).
- Irving Copilowish (December 1948). "Matrix development of the calculus of relations". Journal of Symbolic Logic. 13 (4): 193–203.
- Gunther Schmidt (2013). "6: Relations and Vectors". Relational Mathematics. Cambridge University Press. p. 91. doi:10.1017/CBO9780511778810. ISBN 9780511778810.
- Beth, Thomas; Jungnickel, Dieter; Lenz, Hanfried (1986). Design Theory. Cambridge University Press. p. 18. 2nd ed. (1999) ISBN 978-0-521-44432-3.
External links
- "Logical matrix", Encyclopedia of Mathematics, EMS Press, 2001 [1994]