It is mathematically the most "economical" integer base in terms of space: it minimizes the number of digit values you need multiplied by the word length required to represent a number (the "radix economy") [1].
Of course the "optimal" base would be e, but nobody is going to be able to do anything remotely reasonable with it... (p. 491 of that manuscript).
Well, you could... if you want to give up discrete integer representation...
Anyway, whether ternary is actually optimal in practice depends just as much on how efficiently ternary logic can be implemented on a silicon process. If you increase integer storage efficiency by 10% but circuit density decreases, you may not be getting enough value.
Okay but what's the significance of (non-unity) powers of e? e is the optimum of a specific function[1], and you lose the benefits for higher values.
It's true that you can conveniently represent numbers using integer powers of the base (like writing binary as hexadecimal), but the whole reason e is a "good" base in the first place is that it (partially) minimizes the number of distinct symbols needed to represent a number, and you lose that once you move to a higher base.
Plus, using a slightly-off integer like 7 breaks the integer-power-mapping anyway.
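For anyone curious, the function being minimized here is easy to play with. A minimal sketch (assuming the standard radix-economy cost, b / ln(b), which is proportional to b · log_b(N) for a fixed N):

```python
import math

def radix_cost(b):
    """Cost per unit of information for base b: b / ln(b).

    Proportional to (number of digit values) * (word length) for a fixed
    number N. Minimized over real bases at b = e; among integers, b = 3 wins.
    """
    return b / math.log(b)

for b in (2, 3, math.e, 4, 10):
    print(f"base {b:6.3f} -> cost {radix_cost(b):.4f}")
```

Note that bases 2 and 4 tie exactly (4 / ln 4 = 2 / ln 2), and 3 beats both, with e sitting just below 3.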
I'm replying from memory, so this may be worth researching yourself, but I remember a talk at a cryptography conference about how important constant-time algorithms are for preventing leaks of the number of set bits in keys and messages. The speaker demonstrated that balanced ternary enabled some really cool techniques.
Don't know if any of that has made it into the literature, but you could have a look.
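For concreteness, here's a minimal sketch of balanced-ternary conversion (the helper names are mine, not from that talk). One property that makes it attractive: negation is just flipping every digit, with no carries.

```python
def to_balanced_ternary(n):
    """Return the digits of n in balanced ternary, each in {-1, 0, 1},
    least significant digit first."""
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:
            r = -1  # digit 2 becomes -1 with a carry (2 = 3 - 1)
        digits.append(r)
        n = (n - r) // 3
    return digits or [0]

def from_balanced_ternary(digits):
    """Inverse: sum of digit * 3^position."""
    return sum(d * 3**i for i, d in enumerate(digits))

# Negation is digit-wise: no carry propagation, unlike two's complement.
assert from_balanced_ternary([-d for d in to_balanced_ternary(5)]) == -5
```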