Enable sitemap and robots.txt

Currently robots.txt cannot be found (even though it exists), and sitemap generation fails. The sitemap should be available at https://clarin.eurac.edu/repository/xmlui/htmlmap (cf. https://github.com/ufal/clarin-dspace/commit/1082062fb6d16d98ec32ede5557b3dedde8d9ba9). The error is:

```
context:/file:///opt/lindat-dspace/installation/webapps/xmlui/sitemap.xmap - 643 : 56<map:read type="SitemapReader">
context:/file:///opt/lindat-dspace/installation/webapps/xmlui/sitemap.xmap - 642 : 50<map:match>
```

The stack trace (see the attached file sitemap-stacktrace.txt) points to this line, where the referenced file does not seem to exist.

A few lines earlier there is a reference to the configuration parameter sitemap.dir (ConfigurationManager.getProperty("sitemap.dir")), which is set in this line of dspace.cfg to a non-existent directory.
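
For reference, a minimal sketch of what the dspace.cfg entry might look like; the exact path is an assumption based on the installation layout in the error message above (the stock dspace.cfg defaults to ${dspace.dir}/sitemaps):

```
# Hypothetical value matching the /opt/lindat-dspace layout above; adjust as needed.
sitemap.dir = /opt/lindat-dspace/installation/sitemaps
```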

Simply creating the directory is not enough to make everything work; presumably the sitemap files also need to be generated before /htmlmap can serve anything.
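
A minimal sketch of the steps that might be needed, assuming the DSpace launcher lives under /opt/lindat-dspace/installation (matching the paths in the error above) and that sitemap.dir is set as sketched earlier; generate-sitemaps is the stock DSpace command-line tool for producing the files served at /htmlmap:

```sh
# Create the directory that sitemap.dir points to
# (path is an assumption based on the installation layout above).
mkdir -p /opt/lindat-dspace/installation/sitemaps

# Generate the HTML and sitemaps.org sitemap files with the stock DSpace tool.
/opt/lindat-dspace/installation/bin/dspace generate-sitemaps

# Keep the sitemaps fresh, e.g. via a daily cron entry:
# 0 3 * * * /opt/lindat-dspace/installation/bin/dspace generate-sitemaps
```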