Why can't Applebot crawl my website?


I am trying to set up universal links for an iOS app.

The Apple search validator keeps failing when I try to validate the apple-app-site-association file, with the error message:

Unable to parse that webpage URL. Try a different URL.

The file content is correct; I even tried files that had already been validated on other websites, but it seems the crawler cannot access the website at all.

The domain and website are hosted on a shared server at 1and1.com without SSL. The file is not signed.
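For reference, this is the general shape of the file I am serving (the team ID and bundle identifier below are placeholders, not my real values):

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.com.example.app",
        "paths": ["*"]
      }
    ]
  }
}
```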

Any idea why that is?


There are 2 answers

Alex Bauer On BEST ANSWER

This is the "App Search API Validation Tool", not a "Universal Links Validation Tool" (Apple does not offer one). The results from this tool have no connection to whether Universal Links work or not.

That said, you must have SSL in order for Universal Links to work. That is the number one requirement. If you can't or don't want to set this up, look at an external link-hosting service such as Firebase Dynamic Links or Branch.io (full disclosure: I'm on the Branch team).

John Jason On

In reference to "you must have SSL in order for Universal Links to work. That is the number one requirement": this is no longer a requirement.

If your app runs in iOS 9 or later and you use HTTPS to serve the apple-app-site-association file, you can create a plain text file that uses the application/json MIME type and you don’t need to sign it.
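As a quick sanity check before re-running Apple's validator, you can mimic the documented requirements locally. This is a rough sketch of those checks (correct MIME type, parseable JSON, an `applinks.details` section), not Apple's actual validation logic; the sample values are placeholders:

```python
import json

def check_aasa(content_type: str, body: bytes) -> list:
    """Return a list of problems with an apple-app-site-association response.

    Rough sketch of the documented requirements (application/json MIME
    type, plain unsigned JSON, an applinks.details section) -- not
    Apple's real validator.
    """
    problems = []

    # The file should be served with the application/json MIME type.
    if content_type.split(";")[0].strip() != "application/json":
        problems.append("unexpected Content-Type: " + content_type)

    # An unsigned file must be plain JSON; a signed file would be binary
    # and fail to parse here.
    try:
        data = json.loads(body)
    except ValueError:
        problems.append("body is not valid JSON (is the file signed or binary?)")
        return problems

    details = data.get("applinks", {}).get("details")
    if not details:
        problems.append("missing applinks.details section")
    else:
        for entry in details:
            if "appID" not in entry or "paths" not in entry:
                problems.append("entry missing appID/paths: %r" % entry)

    return problems

# Placeholder app ID; substitute your own TEAMID.bundle-identifier.
good = b'{"applinks": {"apps": [], "details": [{"appID": "TEAMID.com.example.app", "paths": ["*"]}]}}'
print(check_aasa("application/json", good))   # []
print(check_aasa("text/html", b"<html>"))
```

If this reports no problems but Apple's tool still fails, the issue is likely reachability of the host itself rather than the file.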