Rank
You are given a matrix that contains only 0s and 1s. Moreover, every column of this matrix contains exactly two 1s. Calculate the rank of this matrix.
The rank of a matrix is the maximum number k such that there exists a set of k linearly independent rows. A set {v[1], ..., v[k]} is called linearly independent if and only if, for any real numbers a[1], ..., a[k] such that a[1]^2 + ... + a[k]^2 ≠ 0, the inequality a[1] * v[1] + ... + a[k] * v[k] ≠ 0 holds.
For rows, the standard rules of multiplication by a scalar and addition apply:
a (u(1), ..., u(m)) = (au(1), ..., au(m)), (u(1), ..., u(m)) + (v(1), ..., v(m)) = (u(1) + v(1), ..., u(m) + v(m)).
A zero row is 0 = (0, ..., 0).
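To illustrate the definition above, the rank of a small 0/1 matrix can be computed directly by Gaussian elimination over the rationals (exact arithmetic avoids floating-point issues). This is only an illustrative sketch for tiny matrices, not a solution to the problem at the given limits; the example matrix is hypothetical, chosen so that each column contains exactly two 1s.

```python
from fractions import Fraction

def matrix_rank(rows):
    # Gaussian elimination with exact rational arithmetic.
    mat = [[Fraction(x) for x in row] for row in rows]
    rank = 0
    n_cols = len(mat[0]) if mat else 0
    for col in range(n_cols):
        # Find a pivot row with a nonzero entry in this column.
        pivot = next((i for i in range(rank, len(mat)) if mat[i][col] != 0), None)
        if pivot is None:
            continue
        mat[rank], mat[pivot] = mat[pivot], mat[rank]
        # Eliminate this column from all rows below the pivot.
        for i in range(rank + 1, len(mat)):
            if mat[i][col] != 0:
                f = mat[i][col] / mat[rank][col]
                for j in range(col, n_cols):
                    mat[i][j] -= f * mat[rank][j]
        rank += 1
    return rank

# Example: a 3x2 matrix whose two columns each contain exactly two 1s.
example = [
    [1, 0],
    [1, 1],
    [0, 1],
]
print(matrix_rank(example))  # 2
```

Here rows 1 and 3 are linearly independent, while row 2 is their sum, so the rank is 2.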
Input
The first line contains two integers n and m (2 ≤ n ≤ 2 * 10^5, 1 ≤ m ≤ 2 * 10^5): the number of rows and columns in the matrix. Each of the next 2m lines contains two integers r and c (1 ≤ r ≤ n, 1 ≤ c ≤ m): the row and column of a cell that contains a 1. Any cell not mentioned in this list contains a 0. It is guaranteed that all pairs (r, c) are distinct and that every column of the matrix contains exactly two 1s.
Output
Print a single number: the rank of the given matrix.