So we're basically almost done with our website and have started the indexation process. Everything is added to GSC (starting from today, so it's a fresh topic).
The general idea is to redirect users to the proper website version based on their browser language (if you come from Germany we redirect you to the domain.de version, if from Poland to the .pl one, and so on).
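To make the setup concrete, here is a minimal sketch of that kind of language-based redirect. The domain mapping, the function name, and the .com fallback are my assumptions based on the description above, not the actual implementation:

```python
# Sketch of a language-based redirect: pick a country domain from the
# Accept-Language request header, falling back to the .com (x-default)
# version. Domains and mapping are placeholders.
DOMAIN_BY_LANG = {
    "de": "https://domain.de",
    "pl": "https://domain.pl",
    "en": "https://domain.co.uk",
}
X_DEFAULT = "https://domain.com"

def redirect_target(accept_language: str) -> str:
    """Return the redirect target for a given Accept-Language header."""
    for part in accept_language.split(","):
        # "de-DE;q=0.9" -> primary language subtag "de"
        lang = part.split(";")[0].strip().lower()[:2]
        if lang in DOMAIN_BY_LANG:
            return DOMAIN_BY_LANG[lang]
    return X_DEFAULT
```

One thing worth noting with this approach: Googlebot usually crawls without a meaningful Accept-Language preference, so a blanket redirect like this sends the crawler to the x-default version too, which may be exactly what "Fetch as Google" is showing.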
After adding the sitemap.xml from the .co.uk version of the website, I noticed that Google Search Console returns a lot of errors saying the URLs are blocked by robots.txt (I guess that's because the service was locked until today).
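If the service really was locked until today, my guess (purely an assumption) is that robots.txt still contained a blanket disallow like this, which would explain the blocked-URL errors:

```
User-agent: *
Disallow: /
```

If that's the case, the errors should clear on their own once Google re-fetches the updated robots.txt.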
To dig a little deeper into the case, I put some URLs from the sitemap.xml into the "Fetch as Google" tool and noticed it reports a redirect to the .com (x-default) version of the website.
Is it because:
1. We have an x-default URL in <HEAD> pointing to our .com domain (used when we don't have your language version)?
2. Google has problems with this kind of solution?
3. Some other problem?
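For context, the hreflang setup I mean in point 1 looks roughly like this in our <HEAD> (the URLs here are placeholders, not our real ones):

```html
<link rel="alternate" hreflang="de" href="https://domain.de/" />
<link rel="alternate" hreflang="pl" href="https://domain.pl/" />
<link rel="alternate" hreflang="en-GB" href="https://domain.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/" />
```

Each country version carries the same set of annotations pointing at all the others, with x-default as the catch-all.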
I didn't think it would be this hard, given that this is how Skyscanner.com does it. :)
I can share the URL with you guys in a private message and discuss it a little bit more.