That's not what the phrase "reasonable doubt" refers to. |
It simply means "a doubt especially about the guilt of a criminal defendant that arises or remains upon fair and thorough consideration of the evidence or lack thereof," straight from Webster's dictionary. Maybe there's a different mathematical meaning I'm not aware of?
What you mean to say is that you've done simple algebra so much that you can do it easily. |
No, basic algebra is simple common sense. For example, if you made a deal with someone that whenever you let them borrow money, they'd pay it back doubled, then it's common sense that if you lend them some amount X, they should return 2X. This can be turned into a simple algebraic equation, but the whole thing was already common sense beforehand.
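To make the point concrete, here's that deal written out as the trivial rule it already is (a minimal sketch; the function name and the example amount are my own):

```python
# The borrowing deal above as a one-line rule: lend an amount x, get back 2x.
# The function name and the example amount are my own illustration.
def repayment(x):
    return 2 * x

print(repayment(50))  # lend 50 -> get back 100
```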
When I was in any algebra class, I never felt like I was learning; it just felt like simple logic put in mathematical terms. This isn't to say all of algebra is this way; some of it is beyond common sense. However, the basics of algebra are deeply rooted in common sense.
You can't apply common sense to algebra because it's entirely removed from your everyday experience. |
This is entirely false in the way you mean it. All of middle school and high school math was a bunch of word problems that were algebra in disguise. Algebra as a subject is completely mathematical and doesn't care about common sense, but the opposite isn't true: parts of common sense, and the entire world, are based upon mathematical principles.
At no point will you hold a system of linear equations in your hand. |
At no point will you ever hold a tangible representation of the English language in your hands. You can't hold common sense in your hands either. It's not a very impressive argument. You can hold something which can be represented by a linear equation, but how can the fact that you can't hold an idea in your hands prove that algebra is removed from everyday experience? Algebra as a concept doesn't care about the world (it obviously can't), but it was created because it can REPRESENT things in the world, the same as common sense.
I assure you if the teacher were to start from first principles and then proceed to give nothing but proofs, you would leave the class having understood nothing new. Simply put, proofs are a mathematical tool, not a teaching tool. |
If a proof cannot present new information, what's the point? That's why I hate them: they serve no purpose except in maybe a few rare cases I could imagine. This is especially true when all a proof does is use what's already known to create a statement that's supposedly fundamentally true, yet can still be wrong because it may not take into account a variable that hasn't been discovered yet.
A proof doesn't need to show how the mathematician figured out how to prove the conjecture. |
Another reason why it's useless. Imagine telling someone X is true, and when they ask why, you give them an explanation that will go over their heads unless they already had the knowledge that would have kept them from asking in the first place.
Likewise, the person reading the proof doesn't need to understand how the proof was arrived at. As long as the premises are true and the argument is valid, the proof is successful. |
That's not logical. If the person reading doesn't know where the proof was derived from, then they've completely missed out on how some of the premises were reached, in which case how can they verify that the premises are true? In simple proofs the premises are easy to verify, but in more complex ones it looks like things were pulled out of thin air, and you can't verify them without attempting the proof yourself (and having sufficient knowledge to do so).
It depends on the axiomatic system being used |
Given the assumptions made and the conclusion to be proved, it was very clear which system was being used.
As a simple example, what iterated addition is equivalent to pi * e? |
The proof restricted us to whole numbers; otherwise even and odd wouldn't make sense. And in that situation, multiplication is simply applied addition. You could argue that pi * e is applied addition too. You'd have pi and e lined up to as many digits as you want to calculate to, and then wherever you would have multiplied two numbers, you simply add X number of times. You could write a for-loop to accomplish this (sketched below). Then you'd have achieved multiplication with only addition. Though this is kind of a hack, I suppose.
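Since a for-loop came up, here's a minimal Python sketch of that idea. The digit count and the names are my own, and it's an illustration of the "hack," not anything rigorous:

```python
# Approximate pi * e using only repeated addition, by truncating both numbers
# to a fixed number of decimal digits and working with scaled integers.
DIGITS = 6                  # how many decimal digits to keep (my own choice)
SCALE = 10 ** DIGITS

# Truncate pi and e to DIGITS decimal places, stored as scaled integers.
pi_scaled = int(3.141592653589793 * SCALE)   # 3141592
e_scaled = int(2.718281828459045 * SCALE)    # 2718281

# "Multiply" by adding pi_scaled to itself e_scaled times.
total = 0
for _ in range(e_scaled):
    total += pi_scaled

# Each factor was scaled by SCALE, so the product is scaled by SCALE squared.
approx = total / (SCALE ** 2)
print(approx)   # roughly 8.5397, close to pi * e
```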
When you introduce extra assumptions you need to check what happens when those assumptions are false, to ensure that the final result doesn't become invalid |
What?? I didn't bring in any extra assumptions. The question said to prove that if p is prime and p doesn't divide a, then the GCD is 1. If p doesn't divide a, then how can a = p? I used no new assumptions; it was a direct proof. The only divisors of a prime p are 1 and p, so if p doesn't divide a, the only common divisor left is 1. p can only divide a if p <= a, and it doesn't divide a. There shouldn't be anything else needed. By the very definition of a prime number, the proof should have been done.
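For completeness, here's that direct argument written out as a short LaTeX proof sketch. It's my own reconstruction of the reasoning above, not the answer that was actually graded:

```latex
\documentclass{article}
\usepackage{amsmath, amssymb, amsthm}  % amsthm for the proof environment, amssymb for \nmid

\begin{document}
% Claim: if p is prime and p does not divide a, then gcd(p, a) = 1.
% My own write-up of the argument above, not the graded submission.
\begin{proof}
Let $d = \gcd(p, a)$, so $d \mid p$ and $d \mid a$.
Since $p$ is prime, its only positive divisors are $1$ and $p$, hence $d = 1$ or $d = p$.
If $d = p$, then $d \mid a$ would give $p \mid a$, contradicting the hypothesis $p \nmid a$.
Therefore $d = 1$.
\end{proof}
\end{document}
```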
you could have avoided the issue. |
There shouldn't have been an issue. I can't predict that kind of irrationality; there's no thought process there to predict. What's a valid answer today won't be valid tomorrow with these professors and TAs.