OK, I finally solved this problem.
The issue was mapreduce.application.classpath. In the Ambari upgrade docs for going from 2.1 to 2.2, there’s a section for setting all the new classpaths. In particular:
$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*: $PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*: $PWD/mr-framework/hadoop/share/hadoop/common/*: $PWD/mr-framework/hadoop/share/hadoop/common/lib/*: $PWD/mr-framework/hadoop/share/hadoop/yarn/*: $PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*: $PWD/mr-framework/hadoop/share/hadoop/hdfs/*: $PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*: /usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure$
I, like most people, copied and pasted that value into Ambari. All of those paths are correct. The problem is that, on this docs page only, some of the classpath entries are separated by a space after the colon, and that is what was causing the error. Once the spaces were removed, MapReduce ran correctly.
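For reference, here is the same value with the stray spaces removed, which is what the property should look like (I'm assuming the value is meant to end at /etc/hadoop/conf/secure, as it does in the HDP docs):

$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure

A quick sanity check after Ambari pushes the config out (assuming the standard HDP config location of /etc/hadoop/conf/mapred-site.xml on the worker nodes) is to grep for a colon followed by a space:

grep -n ': ' /etc/hadoop/conf/mapred-site.xml

If that matches anything inside the mapreduce.application.classpath value, the spaces are still there.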
This one was a HUGE pain in the ass and took a really long time to identify and fix. This page needs to be fixed so the property value is correct (without the spaces). It would also help if Hadoop produced more meaningful error messages, instead of just generic missing-Java-class errors.