Sunday, November 23, 2008

Let d be the greatest common divisor of two positive integers a and b. If c is a positive integer such that c divides a and c divides b, then c <= d.

Let d be the greatest common divisor of two positive integers a and b. Prove that if c is a positive integer such that c divides a and c divides b, then c <= d.

This is a fairly simple proof, since it follows directly from the definition of a greatest common divisor.

A greatest common divisor (GCD) is the largest positive integer that divides two given integers. Take, for example, the numbers 24 and 16: 8 is the GCD of 24 and 16, since 8 is the largest number that divides both of them.
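As a quick illustration (my addition, not part of the original post), here is a minimal Python sketch of computing a GCD with the Euclidean algorithm; the function name gcd and the sample values 24 and 16 are just for demonstration.

```python
def gcd(a, b):
    # Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)
    # until the remainder is zero; the last nonzero value is the GCD.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(24, 16))  # prints 8, matching the example above
```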

So if d = GCD(a, b), then d divides a and d divides b.

The numbers 4 and 2 also divide 24 and 16, but they are not the greatest. However, notice that 4 and 2 also divide 8. This illustrates the second part of the definition of a GCD:

For every positive integer c, if c divides a and c divides b, then c divides d.
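To see this part of the definition concretely, here is a small sketch (assuming Python, reusing the example values 24 and 16 from above) that lists every common divisor and checks that each one divides the GCD:

```python
a, b, d = 24, 16, 8  # d = GCD(24, 16) from the example above

# Every common divisor of a and b should also divide d.
common = [c for c in range(1, min(a, b) + 1) if a % c == 0 and b % c == 0]
print(common)                           # [1, 2, 4, 8]
print(all(d % c == 0 for c in common))  # True
```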

It is this last part of the definition we will use for our proof. If c is a positive integer that divides both a and b, then by definition c must also divide the GCD, which is d. And since a positive divisor of a positive integer can be no larger than that integer, c divides d implies c <= d, which completes the proof.
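As a sanity check on the conclusion (again my addition, not part of the original proof), a brute-force Python sketch that verifies both c divides d and c <= d for every common divisor c over a small range of pairs:

```python
from math import gcd

# For every pair (a, b) in a small range, confirm that each common
# divisor c satisfies both c | gcd(a, b) and c <= gcd(a, b).
for a in range(1, 51):
    for b in range(1, 51):
        d = gcd(a, b)
        for c in range(1, min(a, b) + 1):
            if a % c == 0 and b % c == 0:
                assert d % c == 0 and c <= d
print("verified for all pairs up to 50")
```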
