I want to build an application whose page content is delivered by a web API and fetched by Knockout.js. How should I handle pages that I want to be crawled? Is there a library that serves static HTML pages when a crawler visits the site, and that integrates properly with Sails.js?
It would also be great if I could leave the #! out of the URL, but it's not a necessity.
Here's one solution; most major websites use a variant of it.
If your frontend uses pseudo-pages, you can route those URLs to real server-rendered pages using any templating engine.
If not, one trick is to include a div containing all the content in the markup sent along with res.render; your client-side JavaScript then removes that div before Knockout takes over.
You can also include meta description tags.
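The hidden-div trick might look like this in an EJS view; the `server-content` id and the `content` view local are assumptions, not part of any framework API:

```html
<!-- views/simpleView.ejs (sketch) -->
<div id="server-content">
  <!-- Crawler-visible content, rendered server-side by res.view()/res.render() -->
  <%- content %>
</div>
<script>
  // In the browser, remove the server-rendered copy before Knockout
  // applies its bindings, so users never see the content twice.
  var el = document.getElementById('server-content');
  el.parentNode.removeChild(el);
</script>
```

A crawler that doesn't execute JavaScript sees the div's content; a real browser removes it and lets Knockout render as usual.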
More details:
1) Add custom routes for all the pages you want the crawler to see.
2) Create a simple view template using an engine such as Jade or EJS.
3) Call the API functions internally, get the data, and render the view:
res.view('simpleView', dataFromDb);
4) Give that view template some JavaScript that hides (or removes) the server-rendered content.
5) Knockout.js then renders the content as usual.
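Steps 1–3 can be sketched as a Sails-style controller action. Here `fetchPage` is a hypothetical stand-in for whatever internal call your API uses (in a real Sails app it might be a model query such as `Page.findOne(...).exec(cb)`), and the response object is mocked just to show the flow:

```javascript
// Hypothetical data-access helper standing in for the internal API call.
function fetchPage(slug, cb) {
  var pages = { about: { title: 'About', body: 'Hello crawlers' } };
  cb(null, pages[slug]);
}

// Controller action for the crawler-friendly routes (step 1 would map
// e.g. '/about' to this action in config/routes.js).
function serverRender(req, res) {
  fetchPage(req.params.slug, function (err, dataFromDb) {
    if (err) return res.serverError(err);
    // Step 3: render the simple view template with the fetched data.
    return res.view('simpleView', dataFromDb);
  });
}

// Minimal mock of the Sails request/response, just to exercise the action.
var rendered = null;
serverRender(
  { params: { slug: 'about' } },
  {
    view: function (tpl, data) { rendered = { tpl: tpl, data: data }; },
    serverError: function (e) { throw e; }
  }
);
```

Because the data layer is called directly rather than over HTTP, the crawler gets a fully rendered page in a single request.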