Actually, the next "jump" was from 8 to 10, then 12-bit ADCs in cameras. I do think we'd all agree that 12 bits/channel is better than 8. The remaining bits of the 16-bit word that holds the 12-bit value are simply not used. I'm personally dubious of anything over 14 bits: is that meaningful data, or just noise?
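(If anyone wants to see the arithmetic behind both points, here's a quick Python sketch. The bit-shifting part just illustrates how a 12-bit value sits in a 16-bit word; the full-well and read-noise figures are made-up but plausible numbers, not measurements from any real sensor.)

```python
import math

adc_bits = 12
sample = 3000                  # a 12-bit ADC value (0..4095) stored in a 16-bit word

# The top 16 - 12 = 4 bits of the word are simply zero:
assert sample >> adc_bits == 0

# Scaling to full 16-bit range is just a left shift; it adds no new
# information, the empty bits merely move to the bottom of the word:
full_range = sample << (16 - adc_bits)   # 3000 -> 48000

# Each ADC bit covers roughly one stop (a factor of 2) of signal, so the
# useful bit depth is bounded by the sensor's dynamic range.
# Illustrative, assumed figures:
full_well_electrons = 60_000   # saturation capacity
read_noise_electrons = 4       # noise floor
dynamic_range_stops = math.log2(full_well_electrons / read_noise_electrons)

print(f"~{dynamic_range_stops:.1f} stops of dynamic range")
# ~13.9 stops -> bits beyond ~14 would mostly be digitizing noise
```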
I doubt Adobe is looking to change things - 15 vs. 16 bits is definitely not a "real world" problem. Sorry if it seemed I was serious.