These are my lecture notes for MATH 240, Discrete Mathematics, at Hendrix College.
Go around, greet everyone and learn their names.
Introduce myself (with slide show).
Explain what “Discrete Math” is all about.
Show class website.
Homework: fill out survey, look at syllabus and bring questions.
In pairs: discuss helpful/unhelpful strategies from previous math class(es). Helpful strategies:
Other humans! Work together, office hours.
Skim the chapter ahead of time.
Work for understanding, not just memorization.
Show Disco in replit.com. Get them all to bring it up, play around with it, talk to neighbors, share insights. Go over some of the basics.
Note it does not matter whether the result would actually be negative or not. Disco does not actually simplify/evaluate things to figure out what type they are. In fact, the whole point is to be able to tell what kind of value we will get before actually trying to evaluate an expression.
Disco> :type 3 - 1
3 - 1 : Z
Similarly, N is not closed under division, and Disco has other types to handle that, just like it has Z to handle subtraction, but we’ll talk about them later.

We can define our own functions in Disco. First, we give the name of the function and its type, which will be something like A -> B where A is the type of the function’s inputs, and B is the type of the outputs. Then we define what output the function should give for each input.
double : N -> N
double(n) = 2n
If we write the above definition of double in a .disco file, we can load it at the Disco prompt using the :load command, then try it on some inputs:
Disco> :load double.disco
Loading double.disco...
Loaded.
Disco> double(2)
4
We can also attach tests to our functions.
!!! double(0) == 0
!!! double(2) == 4
!!! double(7) == 14
!!! forall n : N. double(n) >= 0
double : N -> N
double(n) = 2n
Tests can either be simple true or false tests, or they can have a forall to say that something should be true for every value of a certain type. When we :load this file, Disco will run the tests and report whether they succeed.
Disco> :load double.disco
Loading double.disco...
Running tests...
  double: OK
Loaded.
If we try changing the double(n) >= 0 test to double(n) > 0, we can see that the test fails:
Disco> :load double.disco
Loading double.disco...
Running tests...
  double:
However, another way to express a correct test would be as follows:
!!! forall n : N. (n == 0) or double(n) > 0
As a final example, let’s implement factorial (although it is already built in). Recall that n! = n · (n − 1) · (n − 2) · ⋯ · 1. Another way to write this is
n! = n · (n − 1)!
If we add a “base case” 0! = 1, this becomes a perfectly viable way to define factorial. We could transcribe it into Disco as follows:
fac : N -> N
fac(0) = 1
fac(n) = n * fac(n-1)
Unfortunately, this is a type error. Since the input to fac is supposed to be a natural number, the recursive call fac(n - 1) requires n - 1 to be a natural number; but subtraction produces an integer (type Z), so the definition does not type check. We can use the .- operator instead, which works on natural numbers (you will explore what this operator does for homework).
fac : N -> N
fac(0) = 1
fac(n) = n * fac(n .- 1)
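Just as with double, we might attach a few tests to fac. This is a small sketch using the same test syntax as before; the particular test cases are my own additions:

!!! fac(0) == 1
!!! fac(3) == 6
!!! fac(5) == 120
!!! forall n : N. fac(n) >= 1
fac : N -> N
fac(0) = 1
fac(n) = n * fac(n .- 1)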
Negation
Definition 3.2. Let p be a proposition. The negation of p is written ¬p (or p̄) and read “not p” or “It is not the case that p.”
We can make a truth table showing the truth value of ¬p for each possible truth value of p:
p | ¬p
T | F
F | T
Conjunction
Definition 3.3. The conjunction of propositions p, q is written p ∧ q (“p and q”). It is true when both p and q are true, and false otherwise.
Let’s make a truth table. Notice that the truth table for ¬p needed two rows, one for T and one for F. The truth table for p ∧ q will need four rows, one for each possible combination of truth values for p and q.
p | q | p ∧ q
T | T |   T
T | F |   F
F | T |   F
F | F |   F
Example. Conjunction works just like we expect in natural language. Consider: “It is not raining today and I had eggs for breakfast.” This is true if both parts are true. If it’s raining, the statement is false. If I ate something other than eggs, it’s false. Of course it’s definitely false when it’s raining and I ate something other than eggs. If we let r = “It is raining today” and e = “I had eggs for breakfast”, then we can translate this sentence into propositional logic as
¬r ∧ e.
Note that negation has “higher precedence” than conjunction (it has “stickier glue”) so this is unambiguous. It does not mean ¬(r ∧ e), which would be different. If in doubt, just use parentheses: (¬r) ∧ e.
Disjunction
Definition 3.4. The disjunction of p, q is written p ∨ q (“p or q”). It is false when both are false, and true otherwise.
p | q | p ∨ q
T | T |   T
T | F |   T
F | T |   T
F | F |   F
The symbols ∧ and ∨ are easy to mix up. Just remember that ∧ looks like a capital “A” for “And”, and ∨ looks like a capital “V” for “Vote” (a vote is when you choose between options). There’s a good reason the symbols are upside-down versions of each other: they are “opposite” (the mathy word is “dual”) in the sense that if we consider an opposite world where everything false becomes true and vice versa, then ∧ becomes ∨ and vice versa. Just look at their definitions: the definition of ∨ is the same as that for ∧ but with all the T’s and F’s switched.

The way we usually think about p ∨ q, though, is that p ∨ q is true whenever at least one of p, q is true. Which statement from the side board does this correspond to? (The one about prerequisites.)

We use the word “or” with two different meanings in English. This one, where it’s OK for both things to be true, is called inclusive or. The other is called exclusive or, written p ⊕ q. It is true when exactly one of p, q is true, and false otherwise. (Put another way, it is true when p and q are different, and false when they are the same.) Exclusive or is very important in computer science, but rarely comes up in mathematical logic. Perhaps we’ll return to it later in the course.
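For reference, here is the truth table for exclusive or:

p | q | p ⊕ q
T | T |   F
T | F |   T
F | T |   T
F | F |   F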
Example. Let r = “It is raining today”, e = “I had eggs for breakfast”, and c = “I had cereal for breakfast.” Consider the sentence “Either it is raining, or I ate cereal and not eggs for breakfast.” How do we translate this into propositional logic?
r ∨ (c ∧ ¬e)
(Notes: it is not clear which kind of or to use; it doesn’t matter too much. The parentheses may technically not be required, but it’s best to include them.) What about “If I ate eggs for breakfast, then it is raining”? This is a conditional (implication), written e → r.
So overall we have the following truth table:
p | q | p → q
T | T |   T
T | F |   F
F | T |   T
F | F |   T
In other words, the only scenario in which an implication p → q is false is when p is true but q is false. It is true in all other cases. “If the premise is false, anything goes!” Note that we have lots of ways to express conditionals in natural language besides just “if p then q”. For example, “if p, (then) q”, “p implies q”, “q if p”, “p only if q”, and so on. If you’re not sure, make a truth table!
Definition 4.2. Given an implication p → q, we have:
the converse is q → p
the inverse is ¬p → ¬q
the contrapositive is ¬q → ¬p (the contrapositive is the inverse of the converse or vice versa)
Theorem 4.3. An implication and its contrapositive always have the same truth value. On the other hand, an implication does not always have the same truth value as its inverse or converse.
Talk to your neighbor and convince yourself that an implication has the same truth value as its contrapositive. How would we prove this?
Definition 4.4. Let p, q be propositions. The biconditional p ↔ q (“p if and only if q”) is true when p and q have the same truth value, and false otherwise.
Remark. We often abbreviate “if and only if” as “iff”.
Definition 4.5. A proposition that is always true, no matter the values of the propositional variables it contains, is a tautology. One that is always false is a contradiction.
Example. Get the class to come up with some examples of tautologies, contradictions, and some that are neither.
How would we prove these? Use logical reasoning, or use a truth table. Or use Disco!
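For instance, one way to check a candidate tautology in Disco is to turn it into a function on Bool and attach a forall test, in the same style as the double example. This is only a sketch: the function name is made up, and it assumes Disco provides a not operator alongside the or we used earlier.

!!! forall p : Bool. excludedMiddle(p)
excludedMiddle : Bool -> Bool
excludedMiddle(p) = p or not(p)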
Definition 4.6. Two propositions that always have the same truth values are logically equivalent, written p ≡ q. Alternatively, p ≡ q when p ↔ q is a tautology.
Example. Show that p → q ≡ ¬p ∨ q. Make a truth table:
p | q | p → q | ¬p | ¬p ∨ q
T | T |   T   |  F |   T
T | F |   F   |  F |   F
F | T |   T   |  T |   T
F | F |   T   |  T |   T
Notice that the columns for p → q and ¬p ∨ q are exactly the same, which means that these two propositions always have the same truth value. Hence they are logically equivalent.
Example. Show that p → q ≡ ¬q → ¬p.
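One way is a truth table, just as in the previous example:

p | q | p → q | ¬q | ¬p | ¬q → ¬p
T | T |   T   |  F |  F |    T
T | F |   F   |  T |  F |    F
F | T |   T   |  F |  T |    T
F | F |   T   |  T |  T |    T

The columns for p → q and ¬q → ¬p are identical, so the two propositions are logically equivalent.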
Because ∧ and ∨ are associative, we can write conjunctions or disjunctions of several propositions with no parentheses, since where we put the parentheses doesn’t matter. (But if we mix ∧ and ∨ we definitely need parentheses.) You should convince yourself of all of these, using either a truth table or thinking about a logical argument. From now on, we don’t have to use truth tables anymore; we can prove any equivalence we want using these core equivalences. Here are a couple of examples.
Example. Prove (p → q) ∧ (p → r) ≡ p → (q ∧ r).
  (p → q) ∧ (p → r)
≡ { Implication }
  (¬p ∨ q) ∧ (¬p ∨ r)
≡ { Factor out ¬p (distributivity, backwards) }
  ¬p ∨ (q ∧ r)
≡ { Implication }
  p → (q ∧ r)
S’20: Only made it to here. Can do more examples if time.
Example. Show ¬(p ∨ (¬p ∧ q)) ≡ ¬p ∧ ¬q.
  ¬(p ∨ (¬p ∧ q))
≡ { De Morgan }
  ¬p ∧ ¬(¬p ∧ q)
≡ { De Morgan }
  ¬p ∧ (¬¬p ∨ ¬q)
≡ { Double negation elimination }
  ¬p ∧ (p ∨ ¬q)
≡ { Distributivity }
  (¬p ∧ p) ∨ (¬p ∧ ¬q)
≡ { Contradiction }
  F ∨ (¬p ∧ ¬q)
≡ { Identity }
  ¬p ∧ ¬q
If yet more time left, show (p ∧ q) → (p ∨ q) is a tautology, by deriving equivalence to T.
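One possible derivation, in the same style as the examples above (the names I give for the justifications may differ slightly from the ones in our list of equivalences):

  (p ∧ q) → (p ∨ q)
≡ { Implication }
  ¬(p ∧ q) ∨ (p ∨ q)
≡ { De Morgan }
  (¬p ∨ ¬q) ∨ (p ∨ q)
≡ { Commutativity and associativity of ∨ }
  (¬p ∨ p) ∨ (¬q ∨ q)
≡ { Negation: ¬p ∨ p ≡ T }
  T ∨ T
≡ { Domination }
  T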
By this point we’ve caught up with the ancient Greeks. However, it turns out that propositional logic isn’t expressive enough to talk about everything we want to reason about mathematically. In particular we can’t make general sorts of statements involving words like “all” or “some”. We are able to write tests with forall in Disco; let’s think more formally about what this means.
Consider the statement x + 2 = 5.
We actually used this as an example on the first day of class: it is not a proposition because we can’t say whether it is true or false; it depends on x.
Disco> :test x + 2 == 5
Error: there is nothing named x.
https://disco-lang.readthedocs.io/en/latest/reference/unbound.html
However, it’s almost a proposition: it’s “waiting for” a value of x to be filled in, at which point it will be a proposition. We can make this explicit by defining P (x) to be the statement “x + 2 = 5”; then P (x) is a predicate, that is, a function that takes x as input and outputs a proposition.
P : N -> Bool
P(x) = x + 2 == 5
Example.
P(2) is the proposition 2 + 2 = 5 (which happens to be false).
P(3) is the proposition 3 + 2 = 5 (which happens to be true).
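If we put P in a .disco file, we could even record the second fact as a test, using the same test syntax as before (a small sketch):

!!! P(3)
P : N -> Bool
P(x) = x + 2 == 5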
We can also make multi-argument predicates, just like we can have functions of multiple variables. For example, let T (x, y) mean “x is less than the square of y”.
T : N * N -> Bool
T(x,y) = x < y^2
Definition 6.1. If P (x) is a predicate and D is some “domain” (i.e. set of values, i.e. type) then ∀x : D. P (x)
(pronounced “for all x in D, P (x)”) is a proposition which is true iff P (x) is true for every x in the domain D.
Remark. Dually to ∀, we can think of ∃ as a giant “or”, that is,
∃x : D. P(x) ≡ P(x₁) ∨ P(x₂) ∨ P(x₃) ∨ ⋯
even though it is not actually defined this way.
Everything follows from thinking about ∀ in terms of ∧ and ∃ in terms of ∨! First we have the De Morgan laws for quantifiers:
¬(∀x : D. P(x)) ≡ ∃x : D. ¬P(x)
¬(∃x : D. P(x)) ≡ ∀x : D. ¬P(x)
There are other equivalences such as
∀x : D. (P(x) ∧ Q(x)) ≡ (∀x : D. P(x)) ∧ (∀x : D. Q(x))
∃x : D. (P(x) ∨ Q(x)) ≡ (∃x : D. P(x)) ∨ (∃x : D. Q(x))
Translate each English sentence into formal logic.
Example. “Every natural number is less than or equal to its own square.”
∀n : N. n ≤ n^2
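This is exactly the kind of statement we can check with a forall test in Disco. A sketch (the function name is made up for illustration):

!!! forall n : N. leqSquare(n)
leqSquare : N -> Bool
leqSquare(n) = n <= n^2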
Example. “1369 is a perfect square.”
∃n : N. n^2 = 1369
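(One witness is n = 37, since 37^2 = 1369.)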
Example. Something with nested quantifiers...
Example. “Every student in Discrete Math has taken calculus.”
We could write ∀s : student in Discrete. C(s), where C(s) means student s has taken calculus.
We could also write ∀s : student at Hendrix. D(s) → C(s), where D(s) means student s is in Discrete. You should convince yourself this is logically equivalent to the previous one! Alternatively, we could write ∀s : student in Discrete. T(s, Calculus), where T(s, c) means student s has taken class c.
Example. “There is a weekday when everyone turns in their homework.” Let H(s, d) mean that student s turns in their homework on day d. Then we can write ∃d : Weekday. ∀s : Student. H(s, d).
Alternatively, if we use the domain of all days instead of weekdays,
∃d : day. Weekday(d) ∧ ∀s : Student. H(s, d).
Example. Goldbach conjecture: “every even number greater than 2 is equal to the sum of two prime numbers.” This is a very advanced example!
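One way to write it formally, assuming predicates Even and Prime with the obvious meanings:
∀n : N. (Even(n) ∧ n > 2) → (∃p : N. ∃q : N. Prime(p) ∧ Prime(q) ∧ n = p + q)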
What is a proof? In a court of law, a proof is convincing evidence. In math we desire something more logically ironclad.
Definition 7.1. A proof is a logically valid argument that establishes the truth of a statement.
We distinguish two types of proofs. A formal proof:
Uses only axioms (assumptions) and things previously proved.
Consists of a series of steps where each step is a valid logical inference from previous steps or assumptions.
Is often expressed in formal notation.
We will not write these! The only places you will ever see a real, formal proof are (1) in a geometry class, or (2) a computer-checked formal proof (these are really interesting; take my Functional Programming class if you want to learn more!). In contrast, an informal proof:
Uses only axioms and things previously proved (same as formal proofs!).
May omit or combine steps.
Is often expressed in natural language.
In principle could be expanded into a complete formal proof.
What is a valid logical inference? We will leave this mostly to our intuition. (Your book covers it in section 1.6 but it’s overwhelming and unhelpful, IMO.) You have been using valid logical inferences your whole life. We’ll give just two examples:
((A → B) ∧ A) → B ≡ T.
(A ∧ B) → A ≡ T.
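To see why the first of these is a tautology, a quick truth table helps:

A | B | A → B | (A → B) ∧ A | ((A → B) ∧ A) → B
T | T |   T   |      T      |         T
T | F |   F   |      F      |         T
F | T |   T   |      F      |         T
F | F |   T   |      F      |         T

Every row in the last column is T.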
Picking up from last time...
8.0.1 p ↔ q
To prove an if and only if, p ↔ q, prove (p → q) ∧ (q → p).
8.0.2 ¬p
To prove a negation ¬p, assume p and derive a contradiction.
8.0.3 ∀x : D. P(x)
To prove a “universally quantified” statement ∀x : D. P(x), let x stand for an arbitrary element of D and prove that P(x) holds.
8.0.4 ∃x : D. P(x)
To prove an “existentially quantified” statement ∃x : D. P(x), exhibit a specific witness c in D and prove that P(c) holds.
Example. Prove: if n is an odd integer, then n^2 is also odd.
First step: translate to predicate logic! This looks like an if-then statement, so it’s going to be of the form p → q, right? WRONG! Notice that it’s talking about a variable n. It is really making a statement about all possible values of n. This is common in mathematics, to make a “for all” statement just by mentioning some variables without explicitly saying a word like “all” or “every”. So the correct translation is really
∀n : integer. Odd(n) → Odd(n^2).
Now, what does Odd(n) mean?
Definition 8.1. An integer n is odd if there is some integer k such that n = 2k + 1. It is even if there is some integer k such that n = 2k.
So let’s prove ∀n : integer. Odd(n) → Odd(n^2). We start by introducing an arbitrary integer n; then we have to prove Odd(n) → Odd(n^2). This is an implication, so we prove it by supposing that Odd(n) is true, and then proving in that case Odd(n^2) is true as well. We can expand the definition of Odd to figure out what our assumption Odd(n) means and what we have to show for Odd(n^2). The whole thing goes like this:
Proof. Let n be an arbitrary integer; we must show that Odd(n) → Odd(n^2). So suppose Odd(n) is true, that is, there is an integer k such that n = 2k + 1. Then we must show that Odd(n^2) is true. n^2 = (2k + 1)^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1, which is of the form 2j + 1, and hence n^2 is odd. SDG
The above proof has a lot of detail to help us keep track of what is going on. A more experienced mathematician might write something more like this:
Proof. Let n be an odd integer, and suppose n = 2k + 1. Then n^2 = (2k + 1)^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1, so n^2 is odd as well. SDG
But you are welcome and encouraged to include a lot of detail about what you are doing, especially as you are starting out.
Example. Prove that if n is an integer and 3n + 2 is odd, then n is odd.
Again, the first step is to translate into predicate logic:
∀n : integer. Odd(3n + 2) → Odd(n).
Let’s try proving it.
Proof. Let n be an arbitrary integer, and suppose 3n + 2 is odd; we must show n is odd. If 3n + 2 is odd, then by definition there is some integer k such that 3n + 2 = 2k + 1. Solving for n, we find that n = (2k − 1)/3... but it is unclear where to go from here. We need to show that n is of the form 2j + 1 for some j, but it doesn’t really look like that. We declare this proof attempt a failure. SDG
However, we have another technique at our disposal for proving an implication: prove the contrapositive!
Proof. Let n be an arbitrary integer; we will show the contrapositive of the statement “if 3n + 2 is odd then n is odd”; that is, we will prove that if n is even (i.e. if it is not odd), then 3n + 2 is also even. So suppose n is even; then there must be some integer k such that n = 2k. Now 3n + 2 = 3(2k) + 2 = 6k + 2 = 2(3k + 1), which is two times an integer; hence 3n + 2 is even. SDG