Google says the Chrome User Experience Report aggregates real-world speed data from opted-in users, and requires that a URL be public (crawlable and indexable) and have a sufficient number of distinct samples to provide a representative, anonymized view of the performance of that URL or origin.
Link to the relevant documentation: https://developers.google.com/speed/docs/insights/v5/about#why-is-the-real-user-crux-data-not-available-for-a-url-or-origin
With that said, I tried modifying my robots.txt file, expecting some improvement in the origin's web-vitals scores.
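For context, a robots.txt change of the kind described would look something like this (the path is hypothetical; robots.txt only affects crawling, not how real-user data is collected):

```
User-agent: *
Disallow: /admin/
```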
Attaching screenshot for reference
PS: We are using WordPress
No, as per the CrUX docs:
The only way to exclude data at the origin level is to host the content under another domain (e.g. a subdomain).
We get this question occasionally with admin sites (e.g. www.example.com/admin), which sometimes don't get as much love and attention as the rest of the site and which people want to exclude (especially as they are not crawlable). However, the origin-level data is supposed to be just that - a view of all URLs in that origin, weighted by page views. If those other pages are used enough to impact the origin-level data, then that reflects the real user experience of that origin.
Google Search Console groups pages differently, attempting to categorise them into different page types. So it may put these into a different group.
CrUX is also intended to be a high-level overview of the site in question. RUM solutions give you much more granularity and let you slice and dice your performance data as you see fit.
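If you want to inspect the origin-level (or per-URL) data yourself, the public CrUX API exposes it. A minimal sketch, assuming you have an API key from the Google Cloud console (the `origin` value here is a placeholder):

```python
# Sketch: querying the CrUX API for origin-level field data.
import json
import urllib.request

CRUX_ENDPOINT = (
    "https://chromeuserexperiencereport.googleapis.com/v1/records:queryRecord"
)

def build_query(origin: str, form_factor: str = "PHONE") -> dict:
    """Build the JSON body for a CrUX origin-level query.

    Use the "url" key instead of "origin" for page-level data.
    """
    return {"origin": origin, "formFactor": form_factor}

def query_crux(api_key: str, origin: str) -> dict:
    """POST the query and return the parsed JSON response, which
    contains metric histograms and percentiles (LCP, CLS, INP, ...)."""
    body = json.dumps(build_query(origin)).encode()
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Note that if the origin (or URL) does not meet the eligibility criteria described above, the API returns a 404 rather than partial data.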