Changes to Hive's DECIMAL datatype (it could cost you lots of pennies)

When a language, framework, tool, or product changes the behavior of a basic datatype, it is surely time to see how the change might affect you.  Hive did exactly that in version 0.13, when DECIMAL went from an unbounded type to one with an explicit precision and scale, defaulting to DECIMAL(10,0).
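As a quick sketch of what that change looks like in practice (the table and column names here are hypothetical): a column declared as bare DECIMAL before the upgrade gets interpreted as DECIMAL(10,0) afterwards, so its fractional digits no longer show up.

```sql
-- Hypothetical table created under Hive 0.12, where DECIMAL took no
-- explicit precision/scale:
CREATE TABLE ledger (amount DECIMAL);

-- Under Hive 0.12 this query could return values like 19.99.
-- After an upgrade to Hive 0.13, the bare DECIMAL column is read as
-- DECIMAL(10,0), so the same stored value displays with no fractional
-- part -- the underlying data is intact, only the interpretation changed.
SELECT amount FROM ledger;
```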

The issue is documented quite well in the Hive wiki's write-up of the Decimal datatype, but credit goes to Kevin Risden, who raised this concern with a working group of Hadoop practitioners before it was documented on the wiki.  After an upgrade from Hive 0.12 to 0.13, he noticed that his tables using the Decimal datatype appeared to have no decimal values post-upgrade.  The underlying data was fine; it just displayed incorrectly.

Again, the Hive wiki explains what happened and how it can be fixed with some ALTER TABLE commands, but this is one of those issues I don't hear folks talking about much, even though it could be a big problem depending on how you use the data.
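The fix amounts to putting an explicit precision and scale back on each affected column.  A sketch of the ALTER TABLE commands involved (table, column, and partition names here are hypothetical; pick a precision and scale that fits your data):

```sql
-- Restore an explicit precision and scale on the column:
ALTER TABLE ledger CHANGE COLUMN amount amount DECIMAL(18,2);

-- For a partitioned table, each existing partition must be altered too:
ALTER TABLE ledger_by_day PARTITION (ds='2014-06-01')
  CHANGE COLUMN amount amount DECIMAL(18,2);
```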

You will notice that you have to issue an ALTER TABLE command for each table, as well as for every single partition (when present).  If you feel a single DECIMAL(precision, scale) setting could address every Decimal column across your entire Hive instance, David Streever has a nice bit of SQL that can take care of this in one quick action when run against the metastore's backing database.
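The general shape of such a metastore-level fix looks something like the following.  This is a sketch under the assumption of a standard metastore schema (column types stored in COLUMNS_V2.TYPE_NAME), not David Streever's actual script, and it rewrites every bare decimal column to one setting; back up the metastore database before attempting anything like it.

```sql
-- Sketch only: run against the metastore's backing RDBMS, not Hive.
-- Rewrites every bare 'decimal' column type to an explicit
-- precision/scale of your choosing.
UPDATE COLUMNS_V2
   SET TYPE_NAME = 'decimal(18,2)'   -- your chosen precision/scale
 WHERE TYPE_NAME = 'decimal';
```

The appeal of the metastore approach is that it covers every table and partition in one pass, at the cost of applying the same precision and scale everywhere.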