I don’t think so, tbh. Japan did a lot of horrendous things back then, but it’s still nowhere near as infamous as Nazi Germany from the same era. The only time I hear it mentioned is usually when someone praises something Japanese and the other person wants to show off how well educated they are by pointing out that Japan did something terrible more than half a century ago.
America downplayed it because it launched straight into Cold War alliance-building, and remembering those crimes would have been inconvenient while trying to line up allies against China and the Soviet Union.
I don’t know what Germany teaches about Japan’s invasion of China or its occupation of Korea and the Pacific islands, but I can certainly see why both East and West Germany would have covered it in school.