(a) Let us take an arbitrary x∈A∪B. We have: x∈A∪B ⇔ (x∈A)∨(x∈B) ⇒ (x∈A)∨(x∈B)∨(x∈C) ⇔ x∈A∪B∪C.
Thus, we conclude that A∪B ⊆ A∪B∪C.
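As a quick sanity check of (a), we can test the inclusion on concrete sets; the particular values of A, B, C below are arbitrary choices for illustration, not part of the proof.

```python
# Illustrative sets (arbitrary choice); the inclusion must hold for any A, B, C.
A, B, C = {1, 2}, {2, 3}, {4}

union_ab = A | B        # A ∪ B
union_abc = A | B | C   # A ∪ B ∪ C

# <= is Python's subset test: every element of A ∪ B lies in A ∪ B ∪ C.
print(union_ab <= union_abc)  # → True
```

Python's `<=` on sets is exactly the subset relation, so the printed `True` mirrors the conclusion A∪B ⊆ A∪B∪C.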
(b) We take an arbitrary x∈A∩B∩C. By definition, we have
x∈A∩B∩C ⇔ x∈A ∧ x∈B ∧ x∈C ⇒ x∈A ∧ x∈B ⇔ x∈A∩B.
Thus, we conclude that A∩B∩C ⊆ A∩B.
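The inclusion in (b) can likewise be checked on example sets (again, the concrete values are only illustrative assumptions):

```python
# Illustrative sets; A ∩ B ∩ C ⊆ A ∩ B must hold for any choice.
A, B, C = {1, 2, 3}, {2, 3, 4}, {3, 5}

# & is set intersection in Python.
print((A & B & C) <= (A & B))  # → True
```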
(c) We assume that the sign "−" denotes set difference. Then for an arbitrary x∈(A−B)−C we have:
x∈(A−B)−C ⇔ x∈(A−B) ∧ x∉C ⇔ x∈A ∧ x∉B ∧ x∉C ⇒ x∈A ∧ x∉C ⇔ x∈A−C.
Thus, (A−B)−C ⊆ A−C.
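For (c), Python's `-` operator computes set difference, so the chain above can be tested directly (the example values are arbitrary):

```python
# Illustrative sets; (A − B) − C ⊆ A − C must hold for any choice.
A, B, C = {1, 2, 3, 4}, {2}, {3}

print(((A - B) - C) <= (A - C))  # → True
```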
(d) For an arbitrary x∈(A−C)∩(C−B) we have:
x∈(A−C)∩(C−B) ⇔ x∈(A−C) ∧ x∈(C−B) ⇔ x∈A ∧ x∉C ∧ x∈C ∧ x∉B.
The conditions x∉C and x∈C contradict each other, so no such x exists; that is, the set (A−C)∩(C−B) is empty. Thus, (A−C)∩(C−B) = ∅.
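The emptiness claim in (d) can be confirmed on example sets as well (values chosen arbitrarily):

```python
# Illustrative sets; (A − C) ∩ (C − B) is empty for any choice of A, B, C.
A, B, C = {1, 2, 3}, {2}, {3, 4}

# An element of A − C avoids C, while an element of C − B lies in C,
# so the intersection can never contain anything.
print((A - C) & (C - B))  # → set()
```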
(e) Let us provide a counterexample: B = C = {1}, A = ∅.
That is, B and C each consist of the single natural number 1, and A is empty. Then (B−A)∪(C−B) = ({1}−∅)∪({1}−{1}) = {1}∪∅ = {1}.
Thus, the statement is false.
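The computation in the counterexample for (e) can be reproduced directly (this only evaluates the left-hand side (B−A)∪(C−B) on the given sets; it does not restate the original claim being disproved):

```python
# The counterexample sets from the text: A empty, B = C = {1}.
A, B, C = set(), {1}, {1}

# (B − A) ∪ (C − B) = ({1} − ∅) ∪ ({1} − {1}) = {1} ∪ ∅ = {1}
print((B - A) | (C - B))  # → {1}
```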