“-” is an ambiguous character. In mathematics, there are several distinct entities that share the same “minus sign” appearance. Programming languages generally have only the devastatingly ambiguous ASCII/Unicode “hyphen-minus” to work with, and only enough context to distinguish unary from binary forms, so they are necessarily limited in their ability to interpret its meaning and usually must choose to support only a subset of the possible meanings. In other words, programming languages are not mathematics.
Although I agree they should attempt to be consistent with mathematics when possible, programming languages are necessarily distinct formal systems with their own rules, despite borrowing many of mathematics’ visual representations and semantics.
In unary form preceding numeric digits, AppleScript treats the hyphen-minus as part of the numeric literal, not as a numeric negation operator. So “-2^2” is the literal “-2” squared, not the _expression_ “2^2” negated. That is, it isn’t about “operator precedence” so much as that the unary hyphen-minus isn’t an operator in this case. The same is true in several other programming languages, if that’s any comfort. (One could argue this is merely an implementation detail, an indirect description of the operator precedence order, but I think keeping this model in mind will help when reading code.)
In “-(2^2)” the hyphen-minus is the numeric negation operator, because the parentheses explicitly indicate that it is not part of a numeric literal.
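For example, here are both parses as a runnable sketch (the variable names are mine; the commented results follow from the behavior described above):

    set squaredLiteral to -2 ^ 2 -- 4: “-2” is a single numeric literal, which is then squared
    set negatedExpression to -(2 ^ 2) -- -4: the parentheses force the hyphen-minus to act as negation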
Of course, what it really comes down to is that AppleScript was designed this way, and changing it now would break compatibility with existing scripts.
As a more general issue: This is one more example of why it is imperative that more programming languages escape the shackles of ASCII’s long-obsolete limitations and support Unicode in source code. Then programming languages could, for example, distinguish between minus sign “−” for subtraction and superscript minus “⁻” for indicating negative numeric literals.
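As a small illustration, the two characters are already distinct code points that AppleScript itself can tell apart today; only the parser support is hypothetical:

    set minusSign to "−" -- U+2212 MINUS SIGN: binary subtraction
    set superscriptMinus to "⁻" -- U+207B SUPERSCRIPT MINUS: the sign of a numeric literal
    log (id of minusSign) -- 8722
    log (id of superscriptMinus) -- 8315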