What is the mathematical convention about the use of alphabet letters as variables in algebra?
Why letters?
There are a couple of ways to think about this. One way is:
If I were to write an equation such as $x + 3 = 5$, we use $x$ because the number is unknown; we literally don't know what number it is, we're going to solve for it somehow, and so we represent it by $x$. It doesn't have to be $x$; it could be anything, say something like $y$ or something like $\square$. The unknown could literally be represented by anything; it's just that mathematics conventionally uses $x$ and the other letters of the alphabet to represent unknown values.
Another way to see this:
Suppose I want to show the relationship between numbers following a particular pattern or sequence. For example, say I'm borrowing some money from Jude and I tell him: give me $3 and I'll pay back $4, give me $7 and I'll pay back $8, give me $19 and I'll pay back $20, give me $2.2 and I'll pay back $3.2. From the pattern, Jude will easily recognise that I'll pay him back an additional dollar on top of whatever amount he gives me.
So, let's say Jude then decides to give me $x$ dollars; he'll definitely be expecting $x + 1$ dollars back. I can therefore define the relationship as
$$f(x) = x + 1$$
It follows that Jude gets back an additional dollar for every amount he gives me.
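As a quick check (assuming the relationship $f(x) = x + 1$ defined above), plugging in the amounts from the example reproduces each repayment:
$$f(3) = 3 + 1 = 4,\qquad f(7) = 7 + 1 = 8,\qquad f(19) = 19 + 1 = 20,\qquad f(2.2) = 2.2 + 1 = 3.2$$
And the letter itself carries no special meaning: writing $g(t) = t + 1$ or $\text{payback}(n) = n + 1$ describes exactly the same rule.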
So, there's no rule that binds us to the use of letters as variables representing the unknown; it is merely a convention we've come to be familiar with in mathematics.