diff --git a/source b/source
index 8d97a837fcf..448c4b728fc 100644
--- a/source
+++ b/source
@@ -2509,6 +2509,19 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute
syntax is defined in Media Fragments URI. MEDIAFRAG

+
URL Pattern

The following terms are defined in URL Pattern: URLPATTERN

+ + +
HTTP and related specifications
@@ -2549,8 +2562,10 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute

The following terms are defined in MIME Sniffing: MIMESNIFF

@@ -2582,6 +2597,7 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute
The No-Vary-Search HTTP Response Header Field

The following terms are defined in The No-Vary-Search HTTP Response Header Field: NOVARYSEARCH

Paint Timing
@@ -4085,6 +4117,9 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute

The following terms are defined in Selectors: SELECTORS

The following features are defined in CSS Values and Units:

@@ -4530,6 +4567,7 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute

The following terms are defined in Storage: STORAGE

+ +
"speculationrules"
+
+
    +
  1. Let result be the result of creating a speculation rules parse result given source text and document.

  2. + +
  3. Mark as ready el given result.

  4. +
+
@@ -64774,6 +64955,15 @@ document.body.append(script1, script2); data-x="concept-script-result">result.

+ +
"speculationrules"
+
+
    +
  1. Register speculation rules given el's relevant global object and el's result.

  2. +
+
@@ -107306,6 +107496,17 @@ location.href = '#foo';
navigationParams's response, and "pre-media".

+
  • +

    If navigationParams's navigable is a top-level traversable, then process the `Speculation-Rules` header given document and navigationParams's response.

    + +

    This is conditional because speculative loads are only considered for top-level traversables, so it would be wasteful to fetch these rules otherwise.

    +
  • +
  • Potentially free deferred fetch quota for document.

  • Return document.

  •

@@ -108276,6 +108477,1984 @@ new PaymentRequest(…); // Allowed to use href="https://github.com/whatwg/html/issues/6905">issue #6905.

    +

    Speculative loading

    + +

    Speculative loading is the practice of performing navigation actions, such as prefetching, ahead of navigation starting. This makes subsequent navigations faster.

    + +

    Developers can initiate speculative loads by using speculation rules. User agents might also perform speculative loads in certain implementation-defined scenarios, such as typing into the address bar.

    + +

    Speculation rules

    + +

    Speculation rules are how developers instruct the browser about speculative loading operations that the developer believes will be beneficial. They are delivered as JSON documents, via either:

    + + + +
    +

    The following JSON document is parsed into a speculation rule set specifying a number of desired conditions for the user agent to start a referrer-initiated navigational prefetch:

    + +
    {
    +  "prefetch": [
    +    {
    +      "urls": ["/chapters/5"]
    +    },
    +    {
    +      "eagerness": "moderate",
    +      "where": {
    +        "and": [
    +          { "href_matches": "/*" },
    +          { "not": { "selector_matches": ".no-prefetch" } }
    +        ]
    +      }
    +    }
    +  ]
    +}
    +
    + +

    A JSON document representing a speculation rule set must meet the following speculation rule set authoring requirements:

    + + + +

    A valid speculation rule is a JSON object that meets the following requirements:

    + + + +

    A valid document rule predicate is a JSON object that meets the following requirements:

    + + + +

    A valid URL pattern input is either:

    + + + +
    Data model
    + +

    A speculation rule set is a struct with the following items:

    + + + +

    In the future, other rules will be possible, e.g., prerender rules. See Prerendering Revamped for such not-yet-accepted extensions. PRERENDERING-REVAMPED

    + +

    A speculation rule is a struct with the following items:

    + + + +
    + +

    A document rule predicate is one of the following:

    + + + +

    A document rule conjunction is a struct with the following items:

    + + + +

    A document rule disjunction is a struct with the following items:

    + + + +

    A document rule negation is a struct with the following items:

    + + + +

    A document rule URL pattern predicate is a struct with the following items:

    + + + +

    A document rule selector predicate is a struct with the following items:

    + + + +
    + +

    A speculation rule eagerness is one of the following strings:

    + +
    +
    "immediate"
    +

    The developer believes that performing the associated speculative loads is very likely to be worthwhile, and they might also expect that load to require significant lead time to complete. User agents should usually enact the speculative load candidate as soon as practical, subject only to considerations such as user preferences, device conditions, and resource limits.

    + +
    "eager"
    +

    User agents should enact the speculative load candidate on even a slight suggestion that the user may navigate to this URL in the future. For instance, the user might have moved the cursor toward a link or hovered it, even momentarily, or paused scrolling when the link is one of the more prominent ones in the viewport. The author is seeking to capture as many navigations as possible, as early as possible.

    + +
    "moderate"
    +

    User agents should enact the candidate if user behavior suggests the user may navigate to this URL in the near future. For instance, the user might have scrolled a link into the viewport and shown signs of being likely to click it, e.g., by moving the cursor over it for some time. The developer is seeking a balance between "eager" and "conservative".

    + +
    "conservative"
    +

    User agents should enact the candidate only when the user is very likely to navigate to this URL at any moment. For instance, the user might have begun to interact with a link. The developer is seeking to capture some of the benefits of speculative loading with a fairly small tradeoff of resources.

    +
    + +

    A speculation rule eagerness A is less eager than another speculation rule eagerness B if A follows B in the above list.

    + +

    A speculation rule eagerness A is at least as eager as another speculation rule eagerness B if A is not less eager than B.

    + +
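The two comparisons above can be sketched as a short, non-normative Python helper (the function and constant names are illustrative, not part of the standard):

```python
# Eagerness values, ordered from most to least eager, matching the list above.
EAGERNESS_ORDER = ["immediate", "eager", "moderate", "conservative"]

def less_eager_than(a: str, b: str) -> bool:
    # A is less eager than B if A follows B in the ordered list.
    return EAGERNESS_ORDER.index(a) > EAGERNESS_ORDER.index(b)

def at_least_as_eager_as(a: str, b: str) -> bool:
    # A is at least as eager as B if A is not less eager than B.
    return not less_eager_than(a, b)
```

Note that "at least as eager as" is reflexive: every eagerness is at least as eager as itself.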
    + +

    A speculation rule tag is either an ASCII string whose code points are all in the range U+0020 to U+007E inclusive, or null.

    + +

    This code point range restriction ensures the value can be sent in an HTTP header with no escaping or modification.

    + +
    + +

    A speculation rule requirement is the string "anonymous-client-ip-when-cross-origin".

    + +

    In the future, more possible requirements might be defined.

    + +
    Parsing
    + +
    +

    Since speculative loading is a progressive enhancement, this standard is fairly conservative in its parsing behavior. In particular, unknown keys or invalid values usually cause parsing failure, since it is safer to do nothing than to possibly misinterpret a speculation rule.

    + +

    That said, parsing failure for a single speculation rule still allows other speculation rules to be processed. It is only in the case of top-level misconfiguration that the entire speculation rule set is discarded.

    +
    + +

    To parse a speculation rule set string given a string input, a Document document, and a URL baseURL:

    + +
      +
    1. Let parsed be the result of parsing a JSON string to an Infra value given input.

    2. + +
    3. If parsed is not a map, then throw a + TypeError indicating that the top-level value needs to be a JSON object.

      + +
    4. Let result be a new speculation rule set.

    5. + +
    6. Let tag be null.

    7. + +
    8. +

      If parsed["tag"] exists:

      + +
        +
      1. If parsed["tag"] is not a speculation rule tag, then throw a TypeError indicating that the speculation rule tag is invalid.

        + +
      2. Set tag to parsed["tag"].

      3. +
      +
    9. + +
    10. Let typesToTreatAsPrefetch be « "prefetch" ».

    11. + +
    12. +

      The user agent may append "prerender" to typesToTreatAsPrefetch.

      + +

      Since this specification only includes prefetching, this allows user agents to treat requests for prerendering as requests for prefetching. User agents which implement prerendering, per the Prerendering Revamped specification, will instead interpret these as prerender requests. PRERENDERING-REVAMPED

      +
    13. + +
    14. +

      For each type of typesToTreatAsPrefetch:

      + +
        +
      1. +

        If parsed[type] exists:

        + +
          +
        1. +

          If parsed[type] is a list, then for each rule of parsed[type]:

          + +
            +
          1. Let rule be the result of parsing a speculation rule given rule, tag, document, and baseURL.

          2. + +
          3. If rule is null, then continue.

          4. + +
          5. Append rule to result's prefetch rules.

          6. +
          +
        2. + +
        3. Otherwise, the user agent may report a warning to the console indicating that the rules list for type needs to be a JSON array.

        4. +
        +
      2. +
      +
    15. + +
    16. Return result.

    17. +
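The overall shape of the algorithm above can be sketched in Python (a non-normative sketch: it elides the Document and base URL plumbing, and takes the per-rule parser as a callback that returns `None` on failure):

```python
import json

def parse_speculation_rule_set_string(input_text: str, parse_rule) -> dict:
    # Top-level misconfiguration throws; individual bad rules are skipped.
    parsed = json.loads(input_text)
    if not isinstance(parsed, dict):
        raise TypeError("the top-level value needs to be a JSON object")

    result = {"prefetch rules": []}

    tag = parsed.get("tag")
    # (The real algorithm also validates the tag against the speculation
    # rule tag grammar and throws a TypeError if it is invalid.)

    types_to_treat_as_prefetch = ["prefetch"]
    # A user agent that does not implement prerendering may also treat
    # "prerender" rules as prefetch rules:
    # types_to_treat_as_prefetch.append("prerender")

    for type_ in types_to_treat_as_prefetch:
        if type_ in parsed:
            if isinstance(parsed[type_], list):
                for raw_rule in parsed[type_]:
                    rule = parse_rule(raw_rule, tag)  # None means "skip this rule"
                    if rule is None:
                        continue
                    result["prefetch rules"].append(rule)
            # else: report a console warning; the rest of the set survives
    return result
```

This illustrates the asymmetry noted earlier: a bad top-level value throws, while a bad individual rule is simply dropped.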
    + +

    To parse a speculation rule given a map input, a speculation rule tag rulesetLevelTag, a Document document, and a URL baseURL:

    + +
      +
    1. +

      If input is not a map:

      + +
        +
      1. The user agent may report a warning to the console indicating that the rule + needs to be a JSON object.

      2. + +
      3. Return null.

      4. +
      +
    2. + +
    3. +

      If input has any key other than "source", "urls", "where", "relative_to", "eagerness", "referrer_policy", "tag", "requires", "expects_no_vary_search", or "target_hint":

      + +
        +
      1. The user agent may report a warning to the console indicating that the rule + has unrecognized keys.

      2. + +
      3. Return null.

      4. +
      + +

      "target_hint" has no impact on the processing model in this standard. However, implementations of Prerendering Revamped can use it for prerendering rules, and so requiring user agents to fail parsing such rules would be counterproductive. PRERENDERING-REVAMPED.

      +
    4. + +
    5. Let source be null.

    6. + +
    7. If input["source"] exists, then set source to input["source"].

    8. + +
    9. Otherwise, if input["urls"] exists and input["where"] does not exist, then set source to "list".

    10. + +
    11. Otherwise, if input["where"] exists and input["urls"] does not exist, then set source to "document".

    12. + +
    13. +

      If source is neither "list" nor "document":

      + +
        +
      1. The user agent may report a warning to the console indicating that a source + could not be inferred or an invalid source was specified.

      2. + +
      3. Return null.

      4. +
      +
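The source-inference steps above (let source be explicit, else infer "list" from "urls", else "document" from "where") can be sketched as a non-normative helper (the name is illustrative; the later conflicting-keys checks are not included here):

```python
def infer_source(input_map: dict):
    # Returns "list", "document", or None (meaning the rule is dropped).
    source = None
    if "source" in input_map:
        source = input_map["source"]
    elif "urls" in input_map and "where" not in input_map:
        source = "list"
    elif "where" in input_map and "urls" not in input_map:
        source = "document"
    if source not in ("list", "document"):
        return None  # report a console warning and drop the rule
    return source
```

Note that a rule with both "urls" and "where" but no explicit "source" infers nothing and is dropped here.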
    14. + +
    15. Let urls be an empty list.

    16. + +
    17. Let predicate be null.

    18. + +
    19. +

      If source is "list":

      + +
        +
      1. +

        If input["where"] exists:

        + +
          +
        1. The user agent may report a warning to the console indicating that there + were conflicting sources for this rule.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. +

        If input["relative_to"] exists:

        + +
          +
        1. +

          If input["relative_to"] is neither "ruleset" nor "document":

          + +
            +
          1. The user agent may report a warning to the console indicating that the + supplied relative-to value was invalid.

          2. + +
          3. Return null.

          4. +
          +
        2. + +
        3. If input["relative_to"] is "document", then set baseURL to document's document base URL.

        4. +
        +
      4. + +
      5. +

        If input["urls"] does not exist or is not a list:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + supplied URL list was invalid.

        2. + +
        3. Return null.

        4. +
        +
      6. + +
      7. +

        For each urlString of input["urls"]:

        + +
          +
        1. +

          If urlString is not a string:

          + +
            +
          1. The user agent may report a warning to the console indicating that the + supplied URL must be a string.

          2. + +
          3. Return null.

          4. +
          +
        2. + +
        3. Let parsedURL be the result of URL parsing urlString with baseURL.

        4. + +
        5. +

          If parsedURL is failure, or parsedURL's scheme is not an HTTP(S) scheme:

          + +
            +
          1. The user agent may report a warning to the console indicating that the + supplied URL string was unparseable.

          2. + +
          3. Continue.

          4. +
          +
        6. + +
        7. Append parsedURL to + urls.

        8. +
        +
      8. +
      +
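Note the error-handling split in the URL-list branch above: a non-string entry fails the whole rule, while an unparseable or non-HTTP(S) URL is merely skipped. A non-normative sketch, using Python's urllib in place of the URL Standard's parser:

```python
from urllib.parse import urljoin, urlparse

def parse_url_list(raw_urls, base_url: str):
    # Returns the resolved URLs, or None if the whole rule must be dropped.
    if not isinstance(raw_urls, list):
        return None
    urls = []
    for url_string in raw_urls:
        if not isinstance(url_string, str):
            return None  # a non-string entry fails the whole rule
        resolved = urljoin(base_url, url_string)
        if urlparse(resolved).scheme not in ("http", "https"):
            continue  # warn and skip this URL, but keep the rest of the rule
        urls.append(resolved)
    return urls
```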
    20. + +
    21. +

      If source is "document":

      + +
        +
      1. +

        If input["urls"] or input["relative_to"] exists:

        + +
          +
        1. The user agent may report a warning to the console indicating that there + were conflicting sources for this rule.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. +

        If input["where"] does not exist, then set predicate to a document rule conjunction whose clauses is an empty list.

        + +

        Such a predicate will match all links.

        +
      4. + +
      5. Otherwise, set predicate to the result of parsing a document rule predicate given input["where"], document, and baseURL.

      6. + +
      7. If predicate is null, then return null.

      8. +
      +
    22. + +
    23. Let eagerness be "immediate" if source is "list"; otherwise, "conservative".

    24. + +
    25. +

      If input["eagerness"] exists:

      + +
        +
      1. +

        If input["eagerness"] is not a speculation rule + eagerness:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + eagerness was invalid.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. Set eagerness to input["eagerness"].

      4. +
      +
    26. + +
    27. Let referrerPolicy be the empty string.

    28. + +
    29. +

      If input["referrer_policy"] exists:

      + +
        +
      1. +

        If input["referrer_policy"] is not a referrer + policy:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + referrer policy was invalid.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. Set referrerPolicy to input["referrer_policy"].

      4. +
      +
    30. + +
    31. Let tags be an empty ordered set.

    32. + +
    33. If rulesetLevelTag is not null, then append + rulesetLevelTag to tags.

    34. + +
    35. +

      If input["tag"] exists:

      + +
        +
      1. +

        If input["tag"] is not a speculation rule + tag:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + tag was invalid.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. Append input["tag"] + to tags.

      4. +
      +
    36. + +
    37. If tags is empty, then append null to tags.

    38. + +
    39. Assert: tags's size is either 1 or + 2.

    40. + +
    41. Let requirements be an empty ordered set.

    42. + +
    43. +

      If input["requires"] exists:

      + +
        +
      1. +

        If input["requires"] is not a list:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + requirements were not understood.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. +

        For each requirement of + input["requires"]:

        + +
          +
        1. +

          If requirement is not a speculation rule requirement:

          + +
            +
          1. The user agent may report a warning to the console indicating that the + requirement was not understood.

          2. + +
          3. Return null.

          4. +
          +
        2. + +
        3. Append requirement to + requirements.

        4. +
        +
      4. +
      +
    44. + +
    45. Let noVarySearchHint be the default URL search variance.

    46. + +
    47. +

      If input["expects_no_vary_search"] exists:

      + +
        +
      1. +

        If input["expects_no_vary_search"] is not a + string:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + `No-Vary-Search` hint was invalid.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. Set noVarySearchHint to the result of parsing a URL search variance given input["expects_no_vary_search"].

      4. +
      +
    48. + +
    49. +

      Return a speculation rule with:

      + +
      +
      URLs
      +
      urls
      + +
      predicate
      +
      predicate
      + +
      eagerness
      +
      eagerness
      + +
      referrer policy
      +
      referrerPolicy
      + +
      tags
      +
      tags
      + +
      requirements
      +
      requirements
      + +
      No-Vary-Search hint
      +
      noVarySearchHint
      +
      +
    50. +
    + +

    To parse a document rule predicate given a value input, a + Document document, and a URL baseURL:

    + +
      +
    1. +

      If input is not a map:

      + +
        +
      1. The user agent may report a warning to the console indicating that the + document rule predicate was invalid.

      2. + +
      3. Return null.

      4. +
      +
    2. + +
    3. +

      If input does not contain exactly one of "and", "or", "not", "href_matches", or "selector_matches":

      + +
        +
      1. The user agent may report a warning to the console indicating that the + document rule predicate was empty or ambiguous.

      2. + +
      3. Return null.

      4. +
      +
    4. + +
    5. Let predicateType be the single key found in the previous step.

    6. + +
    7. +

      If predicateType is "and" or "or":

      + +
        +
      1. +

        If input has any key other than + predicateType:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + document rule predicate had unexpected extra options.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. +

        If input[predicateType] is not a list:

        + +
          +
        1. The user agent may report a warning to the console indicating that the + document rule predicate had an invalid clause list.

        2. + +
        3. Return null.

        4. +
        +
      4. + +
      5. Let clauses be an empty list.

      6. + +
      7. +

        For each rawClause of + input[predicateType]:

        + +
          +
        1. Let clause be the result of parsing a document rule predicate given rawClause, + document, and baseURL.

        2. + +
        3. If clause is null, then return null.

        4. + +
        5. Append clause to + clauses.

        6. +
        +
      8. + +
      9. If predicateType is "and", then return a + document rule conjunction whose clauses is + clauses.

      10. + +
      11. Return a document rule disjunction whose clauses is clauses.

      12. +
      +
    8. + +
    9. +

      If predicateType is "not":

      + +
        +
      1. +

        If input has any key other than "not":

        + +
          +
        1. The user agent may report a warning to the console indicating that the + document rule predicate had unexpected extra options.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. Let clause be the result of parsing a document rule predicate given + input[predicateType], document, and + baseURL.

      4. + +
      5. If clause is null, then return null.

      6. + +
      7. Return a document rule negation whose clause is clause.

      8. +
      +
    10. + +
    11. +

      If predicateType is "href_matches":

      + +
        +
      1. +

        If input has any key other than "href_matches" or "relative_to":

        + +
          +
        1. The user agent may report a warning to the console indicating that the + document rule predicate had unexpected extra options.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. +

        If input["relative_to"] exists:

        + +
          +
        1. +

          If input["relative_to"] is neither "ruleset" nor "document":

          + +
            +
          1. The user agent may report a warning to the console indicating that the + supplied relative-to value was invalid.

          2. + +
          3. Return null.

          4. +
          +
        2. + +
        3. If input["relative_to"] is "document", then set baseURL to document's document base URL.

        4. +
        +
      4. + +
      5. Let rawPatterns be input["href_matches"].

      6. + +
      7. If rawPatterns is not a list, then set rawPatterns to + « rawPatterns ».

      8. + +
      9. Let patterns be an empty list.

      10. + +
      11. +

        For each rawPattern of + rawPatterns:

        + +
          +
        1. Let pattern be the result of building a URL pattern from an Infra value given rawPattern and baseURL. If this step throws an exception, catch the exception and set pattern to null.

        2. + +
        3. +

          If pattern is null:

          + +
            +
          1. The user agent may report a warning to the console indicating that the + supplied URL pattern was invalid.

          2. + +
          3. Return null.

          4. +
          +
        4. + +
        5. Append pattern to + patterns.

        6. +
        +
      12. + +
      13. Return a document rule URL pattern predicate whose patterns is patterns.

      14. +
      +
    12. + +
    13. +

      If predicateType is "selector_matches":

      + +
        +
      1. +

        If input has any key other than "selector_matches":

        + +
          +
        1. The user agent may report a warning to the console indicating that the + document rule predicate had unexpected extra options.

        2. + +
        3. Return null.

        4. +
        +
      2. + +
      3. Let rawSelectors be input["selector_matches"].

      4. + +
      5. If rawSelectors is not a list, then set rawSelectors + to « rawSelectors ».

      6. + +
      7. Let selectors be an empty list.

      8. + +
      9. +

        For each rawSelector of + rawSelectors:

        + +
          +
        1. Let parsedSelectorList be failure.

        2. + +
        3. If rawSelector is a string, then set parsedSelectorList to the + result of parsing a selector given + rawSelector.

        4. + +
        5. +

          If parsedSelectorList is failure:

          + +
            +
          1. The user agent may report a warning to the console indicating that the + supplied selector list was invalid.

          2. + +
          3. Return null.

          4. +
          +
        6. + +
        7. For each selector of + parsedSelectorList, append selector + to selectors.

        8. +
        +
      10. + +
      11. Return a document rule selector predicate whose selectors is selectors.

      12. +
      +
    14. + +
    15. Assert: this step is never reached, as one of the previous branches was taken.

    16. +
    + + +
    Processing model
    + +

    A speculative load candidate is a struct with the following items:

    + + + +

    A prefetch candidate is a speculative load candidate with the following + additional item:

    + + + +

    A prefetch IP anonymization policy is either null or a cross-origin prefetch IP anonymization policy.

    + +

    A cross-origin prefetch IP anonymization policy is a struct whose single item is its origin, an origin.

    + +
    + +

    A speculative load candidate candidateA is redundant with another speculative load candidate candidateB if the following steps return true:

    + +
      +
    1. If candidateA's No-Vary-Search hint is not equal to candidateB's No-Vary-Search hint, then return false.

    2. + +
    3. If candidateA's URL is not equivalent modulo search variance to candidateB's URL given candidateA's No-Vary-Search hint, then return false.

    4. + +
    5. Return true.

    6. +
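Treating the No-Vary-Search hint as an opaque value, the check above can be sketched non-normatively as follows (`urls_equivalent` stands in for the No Vary Search "equivalent modulo search variance" check, which is defined elsewhere; hints are compared by simple equality):

```python
def is_redundant_with(candidate_a: dict, candidate_b: dict, urls_equivalent) -> bool:
    # The hints must be equal for the candidates to be comparable at all.
    if candidate_a["no_vary_search_hint"] != candidate_b["no_vary_search_hint"]:
        return False
    # The URLs must be equivalent modulo search variance, per A's hint.
    if not urls_equivalent(candidate_a["url"], candidate_b["url"],
                           candidate_a["no_vary_search_hint"]):
        return False
    return True
```

Requiring equal hints first is what makes this relation symmetric and transitive, as the note below explains.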
    + +
    +

    The requirement that the No-Vary-Search hints be equivalent is somewhat strict. It means that some cases which could theoretically be treated as matching are not treated as such. Thus, redundant speculative loads could happen.

    + +

    However, allowing more lenient matching makes the check no longer an equivalence relation, and producing such matches would require an implementation strategy that does a full comparison, instead of a simpler one using normalized URL keys. This is in line with the best practices for server operators, and attendant HTTP cache implementation notes, in No Vary Search § 6 Comparing.

    + +

    In practice, we do not expect this to cause redundant speculative loads, since server operators and the corresponding speculation rules-writing web developers will follow best practices and use static `No-Vary-Search` header values/speculation rule hints.

    +
    + +
    +

    Consider three speculative load candidates:

    + +
      +
    1. A has a URL of https://example.com?a=1&b=1 and a No-Vary-Search hint parsed from params=("a").

    2. + +
    3. B has a URL of https://example.com?a=2&b=1 and a No-Vary-Search hint parsed from params=("b").

    4. + +
    5. C has a URL of https://example.com?a=2&b=2 and a No-Vary-Search hint parsed from params=("a").

    6. +
    + +

    With the current definition of redundant with, none of these candidates are redundant with each other. A speculation rule set which contained all three could cause three separate speculative loads.

    + +

    A definition which did not require equivalent No-Vary-Search hints could consider A and B to match (using A's No-Vary-Search hint), and B and C to match (using B's No-Vary-Search hint). But it could not consider A and C to match, so it would not be transitive, and thus not an equivalence relation.

    +
    + +
    + +

    Every Document has speculation rule sets, a list of speculation rule sets, initially empty.

    + +

    Every Document has a consider speculative loads microtask queued, a boolean, initially false.

    + +

    To consider speculative loads for a Document document:

    + +
      +
    1. +

      If document's node navigable is not a top-level traversable, then return.

      + +

      Supporting speculative loads into child navigables has some complexities and is not currently defined. It might be possible to define it in the future.

      +
    2. + +
    3. If document's consider speculative loads microtask queued is true, then return.

    4. + +
    5. Set document's consider speculative loads microtask queued to true.

    6. + +
    7. +

      Queue a microtask given document to run the following steps:

      + +
        +
      1. Set document's consider speculative loads microtask queued to + false.

      2. + +
      3. Run the inner consider speculative loads steps for + document.

      4. +
      +
    8. +
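The coalescing pattern in the algorithm above — many synchronous triggers, at most one queued recomputation — can be sketched non-normatively as follows (a plain callback list stands in for the HTML microtask queue, and a counter stands in for the inner steps; all names are illustrative):

```python
class Document:
    def __init__(self, microtask_queue: list):
        self.consider_speculative_loads_microtask_queued = False
        self.microtask_queue = microtask_queue
        self.recompute_count = 0

    def consider_speculative_loads(self):
        # Coalesce: at most one recomputation is queued at a time.
        if self.consider_speculative_loads_microtask_queued:
            return
        self.consider_speculative_loads_microtask_queued = True

        def microtask():
            self.consider_speculative_loads_microtask_queued = False
            self.inner_consider_speculative_loads()

        self.microtask_queue.append(microtask)

    def inner_consider_speculative_loads(self):
        self.recompute_count += 1  # stands in for recomputing all candidates
```

Three synchronous triggers queue a single microtask, so draining the queue performs one recomputation — the batching behavior the note below relies on.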
    + +

    In addition to the call sites explicitly given in this standard:

    + + + +
    +

    In this standard, every call to consider speculative loads is given just a Document, and the algorithm re-computes all possible candidates in a stateless way. A real implementation would likely cache previous computations, and pass along information from the call site to make updates more efficient. For example, if an a element's href attribute is changed, that specific element could be passed along in order to update only the related speculative load candidate.

    + +

    Note that because of how consider speculative loads queues a microtask, by the time the inner consider speculative loads steps are run, multiple updates (or cancelations) might be processed together.

    +
    + +

    The inner consider speculative loads steps for a Document document are:

    + +
      +
    1. If document is not fully active, then return.

    2. + +
    3. Let prefetchCandidates be an empty list.

    4. + +
    5. +

      For each ruleSet of document's speculation rule sets:

      + +
        +
      1. +

        For each rule of ruleSet's prefetch rules:

        + +
          +
        1. Let anonymizationPolicy be null.

        2. + +
        3. If rule's requirements contains "anonymous-client-ip-when-cross-origin", then set anonymizationPolicy to a cross-origin prefetch IP anonymization policy whose origin is document's origin.

        4. + +
        5. +

          For each url of rule's URLs:

          + +
            +
1. Let referrerPolicy be the result of computing a speculative load referrer policy given rule and null.

2. Append a new prefetch candidate with

   URL
       url

   No-Vary-Search hint
       rule's No-Vary-Search hint

   eagerness
       rule's eagerness

   referrer policy
       referrerPolicy

   tags
       rule's tags

   anonymization policy
       anonymizationPolicy

   to prefetchCandidates.

If rule's predicate is not null:

1. Let links be the result of finding matching links given document and rule's predicate.

2. For each link of links:

   1. Let referrerPolicy be the result of computing a speculative load referrer policy given rule and link.

   2. Append a new prefetch candidate with

      URL
          link's url

      No-Vary-Search hint
          rule's No-Vary-Search hint

      eagerness
          rule's eagerness

      referrer policy
          referrerPolicy

      tags
          rule's tags

      anonymization policy
          anonymizationPolicy

      to prefetchCandidates.
For each prefetchRecord of document's prefetch records:

1. If prefetchRecord's source is not "speculation rules", then continue.

2. Assert: prefetchRecord's state is not "canceled".

3. If prefetchRecord is not still being speculated given prefetchCandidates, then cancel and discard prefetchRecord given document.

Let prefetchCandidateGroups be an empty list.

For each candidate of prefetchCandidates:

1. Let group be « candidate ».

2. Extend group with all items in prefetchCandidates, apart from candidate itself, which are redundant with candidate and whose eagerness is at least as eager as candidate's eagerness.

3. If prefetchCandidateGroups contains another group whose items are the same as group, ignoring order, then continue.

4. Append group to prefetchCandidateGroups.

      The following speculation rules generate two redundant prefetch candidates:

{
  "prefetch": [
    {
      "tag": "a",
      "urls": ["next.html"]
    },
    {
      "tag": "b",
      "urls": ["next.html"],
      "referrer_policy": "no-referrer"
    }
  ]
}

This step will create a single group containing them both, in the given order. (The second pass through will not create a group, since its contents would be the same as the first group, just in a different order.) This means that if the user agent chooses to execute the "may" step below to enact the group, it will enact the first candidate, and ignore the second. Thus, the request will be made with the default referrer policy, instead of using "no-referrer".

However, the collect tags from speculative load candidates algorithm will collect tags from both candidates in the group, so the `Sec-Speculation-Tags` header value will be `"a", "b"`. This indicates to server operators that either rule could have caused the speculative load.

      +
      +
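The grouping steps above can be sketched non-normatively in JavaScript. The `redundantWith` predicate and `eagernessRank` helper are illustrative stand-ins for the standard's "redundant with" concept and eagerness comparison; they are not part of the standard.

```javascript
// Maps the speculation rule eagerness keywords to numbers (higher = more
// eager), so "at least as eager" becomes a numeric comparison.
const eagernessRank = e =>
  ({ conservative: 0, moderate: 1, eager: 2, immediate: 3 })[e];

// Non-normative sketch of the candidate-grouping steps: each candidate
// seeds a group, absorbs redundant at-least-as-eager candidates, and
// groups with the same members (ignoring order) are deduplicated.
function groupCandidates(candidates, redundantWith) {
  const groups = [];
  for (const candidate of candidates) {
    const group = [candidate];
    for (const other of candidates) {
      if (other !== candidate &&
          redundantWith(other, candidate) &&
          eagernessRank(other.eagerness) >= eagernessRank(candidate.eagerness)) {
        group.push(other);
      }
    }
    // Skip this group if an existing one has the same members, ignoring order.
    const sameMembers = g =>
      g.length === group.length && group.every(c => g.includes(c));
    if (!groups.some(sameMembers)) groups.push(group);
  }
  return groups;
}
```

As in the example above, two mutually redundant candidates of equal eagerness produce a single group in first-seen order.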
For each group of prefetchCandidateGroups:

1. The user agent may run the following steps:

   1. Let prefetchCandidate be group[0].

   2. Let tagsToSend be the result of collecting tags from speculative load candidates given group.

   3. Let prefetchRecord be a new prefetch record with

      source
          "speculation rules"

      URL
          prefetchCandidate's URL

      No-Vary-Search hint
          prefetchCandidate's No-Vary-Search hint

      referrer policy
          prefetchCandidate's referrer policy

      anonymization policy
          prefetchCandidate's anonymization policy

      tags
          tagsToSend

   4. Start a referrer-initiated navigational prefetch given prefetchRecord and document.

   When deciding whether to execute this "may" step, user agents should consider prefetchCandidate's eagerness, in accordance with the current behavior of the user and the definitions of speculation rule eagerness.

   prefetchCandidate's No-Vary-Search hint can also be useful in implementing the heuristics defined for the speculation rule eagerness values. For example, a user hovering over a link whose URL is equivalent modulo search variance to prefetchCandidate's URL given prefetchCandidate's No-Vary-Search hint could indicate to the user agent that performing this step would be useful.

   When deciding whether to execute this "may" step, user agents should prioritize user preferences (express or implied, such as data-saver or battery-saver modes) over the eagerness supplied by the web developer.
    + +

To compute a speculative load referrer policy given a speculation rule rule and an a element, area element, or null link:

1. If rule's referrer policy is not the empty string, then return rule's referrer policy.

2. If link is null, then return the empty string.

3. Return link's hyperlink referrer policy.
    + +
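Non-normatively, this computation can be sketched in JavaScript; the shapes of `rule` and `link` here are illustrative simplifications, not the standard's actual structs:

```javascript
// Sketch of "compute a speculative load referrer policy" (non-normative).
// `rule.referrerPolicy` stands in for the rule's referrer policy; `link`
// is either null or an object exposing the link's hyperlink referrer
// policy. Both shapes are assumptions for illustration.
function computeSpeculativeLoadReferrerPolicy(rule, link) {
  if (rule.referrerPolicy !== "") return rule.referrerPolicy; // rule wins
  if (link === null) return ""; // URL-based candidates have no link
  return link.referrerPolicy;   // otherwise, the link's own policy
}
```

The rule's own referrer policy, when present, always takes precedence over the matched link's policy.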

To collect tags from speculative load candidates given a list of speculative load candidates candidates:

1. Let tags be an empty ordered set.

2. For each candidate of candidates:

   1. For each tag of candidate's tags: append tag to tags.

3. Sort in ascending order tags, with tagA being less than tagB if tagA is null, or if tagA is code unit less than tagB.

4. Return tags.
    + +
    + +
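A non-normative JavaScript sketch of this algorithm, modeling each candidate's tags as an array of strings or null (null meaning "no developer-provided tag"):

```javascript
// Sketch of "collect tags from speculative load candidates"
// (non-normative). A Set models the ordered set, deduplicating tags
// while preserving insertion order before the final sort.
function collectTags(candidates) {
  const tags = new Set();
  for (const candidate of candidates) {
    for (const tag of candidate.tags) tags.add(tag);
  }
  // Ascending sort: null sorts before any string; strings compare by
  // code unit (the default behavior of < on JavaScript strings).
  return [...tags].sort((a, b) => {
    if (a === b) return 0;
    if (a === null) return -1;
    if (b === null) return 1;
    return a < b ? -1 : 1;
  });
}
```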

To find matching links given a Document document and a document rule predicate predicate:

1. Let links be an empty list.

2. For each shadow-including descendant descendant of document, in shadow-including tree order:

   1. If descendant is not an a or area element with an href attribute, then continue.

   2. If descendant is not being rendered or is part of skipped contents, then continue.

      Such links, though present in document, aren't available for the user to interact with, and thus are unlikely to be good candidates. In addition, they might not have their style or layout computed, which might make selector matching less efficient in user agents which skip some or all of that work for these elements.

   3. If descendant's url is null, or its scheme is not an HTTP(S) scheme, then continue.

   4. If predicate matches descendant, then append descendant to links.

3. Return links.
    + +

A document rule predicate predicate matches an a or area element el if the following steps return true, switching on predicate's type:

document rule conjunction

1. For each clause of predicate's clauses:

   1. If clause does not match el, then return false.

2. Return true.

document rule disjunction

1. For each clause of predicate's clauses:

   1. If clause matches el, then return true.

2. Return false.

document rule negation

1. If predicate's clause matches el, then return false.

2. Return true.

document rule URL pattern predicate

1. For each pattern of predicate's patterns:

   1. If performing a match given pattern and el's url gives a non-null value, then return true.

2. Return false.

document rule selector predicate

1. For each selector of predicate's selectors:

   1. If performing a match given selector and el with the scoping root set to el's root returns success, then return true.

2. Return false.
    + +
    + +
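The recursive structure of the switch above can be sketched non-normatively in JavaScript. Predicates are modeled as plain objects, and the leaf cases (URL pattern and selector predicates) are delegated to a caller-supplied `matchLeaf` function; both the object shapes and the function name are illustrative, not part of the standard.

```javascript
// Non-normative sketch of document rule predicate matching. Predicates
// are { type: "and" | "or" | "not", clauses?, clause? }; anything else
// is treated as a leaf (URL pattern or selector predicate) and handed
// to matchLeaf(predicate, el).
function predicateMatches(predicate, el, matchLeaf) {
  switch (predicate.type) {
    case "and": // document rule conjunction: every clause must match
      return predicate.clauses.every(c => predicateMatches(c, el, matchLeaf));
    case "or":  // document rule disjunction: any clause may match
      return predicate.clauses.some(c => predicateMatches(c, el, matchLeaf));
    case "not": // document rule negation
      return !predicateMatches(predicate.clause, el, matchLeaf);
    default:    // leaf predicate: URL pattern or selector match
      return matchLeaf(predicate, el);
  }
}
```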

Speculation rules features use the speculation rules task source, which is a task source.

Because speculative loading is generally less important than processing tasks for the purpose of the current document, implementations might give tasks enqueued here an especially low priority.

    + +

    Navigational prefetching

For now, the navigational prefetching process is defined in the Prefetch specification. Moving it into this standard is tracked in issue #11123. PREFETCH

This standard refers to the following concepts defined there:

    + + + +

    The `Speculation-Rules` header

    + +

The `Speculation-Rules` HTTP response header allows the developer to request that the user agent fetch and apply a given speculation rule set to the current Document. It is a structured header whose value must be a list of strings that are all valid URL strings.

    + +

To process the `Speculation-Rules` header given a Document document and a response response:

1. Let parsedList be the result of getting a structured field value given `Speculation-Rules` and "list" from response's header list.

2. If parsedList is null, then return.

3. For each item of parsedList:

   1. If item is not a string, then continue.

   2. Let url be the result of URL parsing item with document's document base URL.

   3. If url is failure, then continue.

   4. In parallel:

      1. Optionally, wait for an implementation-defined amount of time.

         This allows the implementation to prioritize other work ahead of loading speculation rules, as especially during Document creation and header processing, there are often many more important things going on.

      2. Queue a global task on the speculation rules task source given document's relevant global object to perform the following steps:

         1. Let request be a new request whose URL is url, destination is "speculationrules", and mode is "cors".

         2. Fetch request with the following processResponseConsumeBody steps given response response and null, failure, or a byte sequence bodyBytes:

            1. If bodyBytes is null or failure, then abort these steps.

            2. If response's status is not an ok status, then abort these steps.

            3. If the result of extracting a MIME type from response's header list does not have an essence of "application/speculationrules+json", then abort these steps.

            4. Let bodyText be the result of UTF-8 decoding bodyBytes.

            5. Let ruleSet be the result of parsing a speculation rule set string given bodyText, document, and response's URL. If this throws an exception, then abort these steps.

            6. Append ruleSet to document's speculation rule sets.

            7. Consider speculative loads for document.
    + + +
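The per-item filtering in this header processing can be sketched non-normatively in JavaScript using the WHATWG URL API; `speculationRulesURLs` is an illustrative name, and the input is assumed to already be parsed as a structured-field list:

```javascript
// Non-normative sketch: given a structured-field list parsed from a
// `Speculation-Rules` header, keep only string members that parse as
// URLs against the document base URL; other members are skipped.
function speculationRulesURLs(parsedList, baseURL) {
  const urls = [];
  for (const item of parsedList) {
    if (typeof item !== "string") continue; // non-string members are ignored
    try {
      urls.push(new URL(item, baseURL)); // URL parsing with the base
    } catch {
      continue; // URL parse failure: skip this item
    }
  }
  return urls;
}
```

Each surviving URL would then be fetched with destination "speculationrules" as described above.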

    The `Sec-Speculation-Tags` header

    + +

The `Sec-Speculation-Tags` HTTP request header specifies the web developer-provided tags associated with the speculative navigation request. It can also be used to distinguish speculative navigation requests from speculative subresource requests, since `Sec-Purpose` can be sent by both categories of requests.

The header is a structured header whose value must be a list. The list can contain either token or string values. String values represent developer-provided tags, whereas token values represent predefined tags. As of now, the only predefined tag is null, which indicates a speculative navigation request with no developer-defined tag.
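A non-normative sketch of how such a header value might be produced from a collected tag list; `serializeSpeculationTags` is an illustrative name, and JSON string quoting is used only as an approximation of structured-field string serialization (adequate for simple ASCII tags):

```javascript
// Non-normative sketch: the predefined null tag serializes as the bare
// token `null`; developer-provided tags serialize as quoted strings.
// JSON.stringify approximates structured-field string serialization
// for simple ASCII tags (escaping rules differ in edge cases).
function serializeSpeculationTags(tags) {
  return tags.map(t => (t === null ? "null" : JSON.stringify(t))).join(", ");
}
```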


    Security considerations

    Cross-site requests
    + +

Speculative loads can be initiated by web pages to cross-site destinations. However, because such cross-site speculative loads are always done without credentials, as explained below, ambient authority is limited to requests that are already possible via other mechanisms on the platform.

The `Speculation-Rules` header can also be used to issue requests, for JSON documents whose body will be parsed as a speculation rule set string. However, such requests use the "same-origin" credentials mode and the "cors" mode, and responses which do not use the application/speculationrules+json MIME type essence are ignored, so they are not useful in mounting attacks.

    Injected content
    + +

Because links in a document can be selected for speculative loading via document rule predicates, developers need to be cautious if such links might contain user-generated markup. For example, if the href of a link can be entered by one user and displayed to all other users, a malicious user might choose a value like "/logout", causing other users' browsers to automatically log out of the site when that link is speculatively loaded. Using a document rule selector predicate to exclude such potentially-dangerous links, or using a document rule URL pattern predicate to allowlist known-safe links, are useful techniques in this regard.

As with all uses of the script element, developers need to be cautious about inserting user-provided content into <script type=speculationrules>'s child text content. In particular, the insertion of an unescaped closing </script> tag could be used to break out of the script element context and inject attacker-controlled markup.

The <script type=speculationrules> feature causes activity in response to content found in the document, so it is worth considering the options open to an attacker able to inject unescaped HTML. Such an attacker is already able to inject JavaScript or iframe elements. Speculative loads are generally less dangerous than arbitrary script execution. However, the use of document rule predicates could be used to speculatively load links in the document, and the existence of those loads could provide a vector for exfiltrating information about those links. Defense-in-depth against this possibility is provided by Content Security Policy. In particular, the script-src directive can be used to restrict the parsing of speculation rules script elements, and the default-src directive applies to navigational prefetch requests arising from such speculation rules. Additional defense is provided by the requirement that speculative loads are only performed to potentially-trustworthy URLs, so an on-path attacker would only have access to metadata and traffic analysis, and could not see the URLs directly. CSP

It's generally not expected that user-generated content will be added as arbitrary response headers: server operators are already going to encounter significant trouble if this is possible. It is therefore unlikely that the `Speculation-Rules` header meaningfully expands the XSS attack surface. For this reason, Content Security Policy does not apply to the loading of rule sets via that header.

    IP anonymization
    + +

This standard allows developers to request that navigational prefetches are performed using IP anonymization technology provided by the user agent. The details of this anonymization are not specified, but some general security principles apply.

To the extent IP anonymization is implemented using a proxy service, it is advisable to minimize the information available to the service operator and other entities on the network path. This likely involves, at a minimum, the use of TLS for the connection.

Site operators need to be aware that, similar to virtual private network (VPN) technology, the client IP address seen by the HTTP server might not exactly correspond to the user's actual network provider or location, and traffic for multiple distinct subscribers could originate from a single client IP address. This can affect site operators' security and abuse prevention measures. IP anonymization measures might make an effort to use an egress IP address which has a similar geolocation or is located in the same jurisdiction as the user, but any such behavior is particular to the user agent and not guaranteed.


    Privacy considerations

    Heuristics and optionality
    + +

The consider speculative loads algorithm contains a crucial "may" step, which encourages user agents to start referrer-initiated navigational prefetches based on a combination of the speculation rule eagerness and other features of the user's environment. Because it can be observable to the document whether speculative loads are performed, user agents must take care to protect privacy when making such decisions, for instance by only using information which is already available to the origin. If these heuristics depend on any persistent state, that state must be erased whenever the user erases other site data. If the user agent automatically clears other site data from time to time, it must erase such persistent state at the same time.

The use of origin instead of site here is intentional. Although same-site origins are generally allowed to coordinate if they wish, the web's security model is premised on preventing origins from accessing the data of other origins, even same-site ones. Thus, the user agent needs to be sure not to leak such data unintentionally across origins, not just across sites.

    + +

    Examples of inputs which would be already known to the document:

    + + + +

    Examples of persistent data related to the origin (which the origin could have gathered itself) + but which must be erased according to user intent:

    + + + +

    Examples of device information which might be valuable in deciding whether speculative loading + is appropriate, but which needs to be considered as part of the user agent's overall privacy + posture because it can make the user more identifiable across origins:

    + + + + +
    State partitioning
    + +

The start a referrer-initiated navigational prefetch algorithm is designed to ensure that the HTTP requests that it issues behave consistently with how user agents partition credentials according to storage keys. This property is maintained even for cross-partition prefetches, as follows.

If a future navigation using a prefetched response would load a document in the same partition, then at prefetch time, the partitioned credentials can be sent, as they can with subresource requests and scripted fetches. If such a future navigation would instead load a document in another partition, it would be inconsistent with the partitioning scheme to use partitioned credentials for the destination partition (since this would cross the boundary between partitions without a top-level navigation) and also inconsistent to use partitioned credentials within the originating partition (since this would result in the user seeing a document with different state than a non-prefetched navigation). Instead, a third, initially empty, partition is used for such requests. These requests therefore send along no credentials from either partition. However, the resulting prefetched response body constructed using this initially-empty partition can only be used if, at activation time, the destination partition contains no credentials.

This is somewhat similar to the behavior of only sending such prefetch requests if the destination partition is known ahead of time to not contain credentials. However, to avoid such behavior being used as a way of probing for the presence of credentials, instead such prefetch requests are always completed, and in the case of conflicting credentials, their results are not used.

Redirects are possible between these two types of requests. A redirect from a same- to cross-partition URL could contain information derived from partitioned credentials in the originating partition; however, this is equivalent to the originating document fetching the same-partition URL itself and then issuing a request for the cross-partition URL. A redirect from a cross- to same-origin URL could carry credentials from the isolated partition, but since this partition has no prior state this does not enable tracking based on the user's prior browsing activity on that site, and the document could construct the same state by issuing uncredentialed requests itself.

    Identity joining
    + +

Speculative loads provide a mechanism through which HTTP requests for later top-level navigation can be made without a user gesture. It is natural to ask whether it is possible for two coordinating sites to connect user identities.

Since existing credentials for the destination site are not sent (as explained in the previous section), that site is limited in its ability to identify the user before navigation in a similar way to if the referrer site had simply used fetch() to make an uncredentialed request. Upon navigation, this becomes similar to ordinary navigation (e.g., by clicking a link that was not speculatively loaded).

To the extent that user agents attempt to mitigate identity joining for ordinary fetches and navigations, they can apply similar mitigations to speculatively-loaded navigations.

    + +

    The `X-Frame-Options` header

The `X-Frame-Options` HTTP response header is a way

    + +
    Speculation rules parse results
    + +

A speculation rules parse result is a struct that is similar to a script, and also can be stored in a script element's result, but is not counted as a script for other purposes. It has the following items:

    + +
A speculation rule set
    A speculation rule set or null.

An error to rethrow
    A JavaScript value representing an error that will prevent using these speculation rules, when non-null.

To create a speculation rules parse result given a string input and a Document document:

1. Let result be a speculation rules parse result whose speculation rule set is null and whose error to rethrow is null.

2. Parse a speculation rule set string given input, document, and document's document base URL, catching any exceptions. If this threw an exception, then set result's error to rethrow to that exception. Otherwise, set result's speculation rule set to the return value.

3. Return result.
    + +
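The parse-and-capture behavior can be sketched non-normatively in JavaScript; the `parseSpeculationRuleSet` parameter is an illustrative stand-in for the standard's "parse a speculation rule set string" algorithm:

```javascript
// Non-normative sketch of "create a speculation rules parse result".
// Parsing errors are not thrown to the caller; they are captured in
// errorToRethrow for later reporting when the result is registered.
function createSpeculationRulesParseResult(input, baseURL, parseSpeculationRuleSet) {
  const result = { speculationRuleSet: null, errorToRethrow: null };
  try {
    result.speculationRuleSet = parseSpeculationRuleSet(input, baseURL);
  } catch (e) {
    result.errorToRethrow = e; // reported at registration time
  }
  return result;
}
```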

To register speculation rules given a Window global, a speculation rules parse result result, and an optional boolean queueErrors (default false):

1. If result's error to rethrow is not null, then:

   1. If queueErrors is true, then queue a global task on the DOM manipulation task source given global to perform the following step:

      1. Report an exception given by result's error to rethrow for global.

   2. Otherwise, report an exception given by result's error to rethrow for global.

   3. Return.

2. Append result's speculation rule set to global's associated Document's speculation rule sets.

3. Consider speculative loads for global's associated Document.
    + +

To unregister speculation rules given a Window global and a speculation rules parse result result:

1. If result's error to rethrow is not null, then return.

2. Remove result's speculation rule set from global's associated Document's speculation rule sets.

3. Consider speculative loads for global's associated Document.
    + +

To update speculation rules given a Window global, a speculation rules parse result oldResult, and a speculation rules parse result newResult:

1. Remove oldResult's speculation rule set from global's associated Document's speculation rule sets.

2. Register speculation rules given global, newResult, and true.

   When updating speculation rules, as opposed to registering them for the first time, we ensure that any error events are queued as tasks, instead of synchronously fired. Although synchronously executing error event handlers is OK when inserting script elements, it's best if other modifications do not cause such synchronous script execution.
    + @@ -142539,6 +144827,69 @@ interface External { application/json (namely, at the time of writing, no semantics at all). JSON

    +

    application/speculationrules+json

    + +

    This registration is for community review and will be submitted to the IESG for review, + approval, and registration with IANA.

    + + + +
    +
    Type name:
    +
    application
    +
    Subtype name:
    +
speculationrules+json
    +
    Required parameters:
    +
    Same as for application/json JSON
    +
    Optional parameters:
    +
    Same as for application/json JSON
    +
    Encoding considerations:
    +
    8bit (always UTF-8)
    +
    Security considerations:
    +
    Same as for application/json JSON
    +
    Interoperability considerations:
    +
    Same as for application/json JSON
    +
    Published specification:
    +
Labeling a resource with the application/speculationrules+json type asserts that the resource is a JSON text that follows the speculation rule set authoring requirements. Thus, the relevant specifications are JSON and this specification. JSON
    +
    Applications that use this media type:
    +
    +

    Web browsers.

    +
    +
    Additional information:
    +
    +
    +
    Magic number(s):
    +
    Same as for application/json JSON
    +
    File extension(s):
    +
    Same as for application/json JSON
    +
    Macintosh file type code(s):
    +
    Same as for application/json JSON
    +
    +
    +
    Person & email address to contact for further information:
    +
    Domenic Denicola <d@domenic.me>
    +
    Intended usage:
    +
    Common
    +
    Restrictions on usage:
    +
    No restrictions apply.
    +
    Author:
    +
    Domenic Denicola <d@domenic.me>
    +
    Change controller:
    +
    WHATWG
    +
    + +

    Fragments used with + application/speculationrules+json resources have the same semantics as when used with + application/json (namely, at the time of writing, no semantics at all). + JSON

    +

    text/event-stream

    @@ -145864,7 +148215,7 @@ interface External { type script Type of script - "module"; a valid MIME type string that is not a JavaScript MIME type essence match + "module"; "importmap"; "speculationrules"; a valid MIME type string that is not a JavaScript MIME type essence match usemap img @@ -147790,6 +150141,9 @@ INSERT INTERFACES HERE
    [NAVIGATIONTIMING]
    Navigation Timing, Y. Weiss. W3C.
    +
    [NOVARYSEARCH]
    +
    The No-Vary-Search HTTP Response Header Field, D. Denicola, J. Roman. IETF.
    +
    [NPAPI]
    (Non-normative) Gecko Plugin API Reference. Mozilla.
    @@ -147835,6 +150189,12 @@ INSERT INTERFACES HERE
    [PPUTF8]
    (Non-normative) The Properties and Promises of UTF-8, M. Dürst. University of Zürich. In Proceedings of the 11th International Unicode Conference.
    +
    [PREFETCH]
    +
    Prefetch, J. Roman. WICG.
    + +
    [PRERENDERING-REVAMPED]
    +
    (Non-normative) Prerendering Revamped, D. Denicola, D. Farolino. W3C.
    +
    [PRESENTATION]
    Presentation API, M. Foltz, D. Röttsches. W3C.
    @@ -147972,6 +150332,9 @@ INSERT INTERFACES HERE
    [URL]
    URL, A. van Kesteren. WHATWG.
    +
    [URLPATTERN]
    +
    URL Pattern, B. Kelly, J. Roman, 宍戸俊哉. WHATWG.
    +
    [URN]
    URN Syntax, R. Moats. IETF.
    @@ -149291,6 +151654,10 @@ INSERT INTERFACES HERE the worklets. In particular, thanks to Ian Kilpatrick for his work as editor of the original worklets specification.

    +

    Special thanks to the WICG for incubating the speculative loading feature. In particular, thanks to Jeremy Roman + for his work as editor of the original speculation rules and prefetch specifications.

    +

    For about ten years starting in 2003, this standard was almost entirely written by Ian Hickson (Google, ian@hixie.ch).

    @@ -149418,6 +151785,13 @@ INSERT INTERFACES HERE href="https://www.w3.org/Consortium/Legal/2015/copyright-software-and-document">W3C Software and Document License.

    +

    Part of the revision history of the Speculative loading section can be + found in the WICG/nav-speculation + repository, which is available under the W3C Software and + Document License.

    +

    Copyright © WHATWG (Apple, Google, Mozilla, Microsoft). This work is licensed under a Creative Commons Attribution 4.0 International License. To the extent portions of it are incorporated into source code, such