The word 'Deutschland' acquired its present meaning only after the founding of the German Empire, except where anticipated by scholars and poets. However, Emperor Maximilian of Habsburg and Martin Luther prepared the way for imbuing the word with a patriotic or nationalistic resonance.