What is the mathematical convention about the use of alphabet letters as variables in algebra?
Why letters?
There are a couple of ways to think about this. One way is:
If I write "x + 3 = 10", I use "x" because the number is unknown; we literally don't know what number it is yet, and since we intend to solve for it somehow, we represent it with the letter "x". It doesn't have to be "x"; it could be anything, say "? + 3 = 10" or even "😃 + 3 = 10". The unknown could be represented by any symbol at all; mathematics simply uses "x", "y" and other letters of the alphabet as "variables" for unknown values by convention.
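As a quick worked example (using nothing beyond the equation above), here is how the unknown "x" gets pinned down to an actual number:

```latex
\begin{align*}
x + 3 &= 10 \\
x &= 10 - 3 && \text{subtract 3 from both sides} \\
x &= 7
\end{align*}
```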
Another way to see this:
Suppose I want to show the relationship between numbers that follow a particular pattern or sequence. For example, say I want to borrow some money from Jude and I tell him: give me $3 and I'll pay back $4, give me $7 and I'll pay back $8, give me $19 and I'll pay back $20, give me $2.20 and I'll pay back $3.20. From the pattern, Jude will easily recognise that I'll pay him back an additional dollar on top of whatever amount he gives me.
So, if Jude then decides to give me "x" dollars, he'll be expecting "x + 1" dollars back. I can therefore define the relationship as
"y = x + 1"
which captures the fact that Jude gets back an additional dollar on whatever amount he gives me.
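As a quick check, the rule reproduces every one of the amounts I offered Jude above (no new numbers are introduced here):

```latex
\begin{align*}
y &= x + 1 \\
x = 3 &\implies y = 3 + 1 = 4 \\
x = 7 &\implies y = 7 + 1 = 8 \\
x = 19 &\implies y = 19 + 1 = 20 \\
x = 2.2 &\implies y = 2.2 + 1 = 3.2
\end{align*}
```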
So there is no rule that forces us to use letters as the variables representing unknowns; it is merely a convention we have come to be familiar with in mathematics.