I've inserted this line using the "Header Code" property of the page. D.H.E. cannot produce this code for now; maybe it will in the future. It isn't strictly necessary anyway, since this should be the default behaviour.
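For reference, the header line being discussed is presumably a robots meta tag placed in the page head; the exact values below are only an assumption, since the original snippet isn't quoted in the thread:

<meta name="robots" content="index, follow">  <!-- index the page and follow its links; crawlers treat this as the default anyway -->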
As Davide says, it's not necessary, and the example you gave is the default setting anyway. You are better off creating a separate robots.txt file that contains the settings you want.
For example:

User-agent: *
Disallow: /mytestfiles/
Disallow: /images/
Disallow: /cms/
This example allows all robots to crawl everything except the folders named mytestfiles, images and cms.
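For comparison, a robots.txt that allows everything, which is also what crawlers assume when no file exists at all, would just be:

User-agent: *
Disallow:

The empty Disallow line means nothing is blocked. Keep in mind that the file has to be served from the root of the site (e.g. https://example.com/robots.txt) or crawlers won't find it.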