"Do you mean that you'll compare the network requests made from the main and the shadow pages?"
Essentially, yes. Requests from the extension should be treated as if
they come from an origin different from the page's. (We could
potentially piggy-back on existing notions of security principals
(e.g., the ones Firefox has) to avoid huge performance hits.) And if
the extension is tainted, the kinds of requests it can make will be
restricted according to the taint (as in COWL [1], likely using CSP
for the underlying enforcement).
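The taint-to-CSP mapping could be sketched roughly as follows. This is a hypothetical helper, not COWL's actual API (the real enforcement lives inside the browser); it just illustrates how a taint label, modeled as a set of origins, might be compiled down to a connect-src directive:

```javascript
// Sketch (assumed names, not a real COWL interface): derive a CSP
// connect-src directive from a taint label, so a tainted extension can
// only issue requests to the origins whose data it has observed.
function cspForTaint(taintedOrigins) {
  if (taintedOrigins.size === 0) {
    // Untainted: requests are unrestricted (modulo the page's own CSP).
    return "connect-src *";
  }
  // Tainted: only the tainting origins may be contacted, so observed
  // data cannot be exfiltrated elsewhere.
  return "connect-src " + [...taintedOrigins].sort().join(" ");
}

// Example: an extension that has read data from two origins.
const taint = new Set(["https://a.example", "https://b.example"]);
console.log(cspForTaint(taint));
// "connect-src https://a.example https://b.example"
```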
"What if the main script is sensitive to the request receive time? Then the shadow DOM may act differently."
If by main script you mean a script on the page, then there should be no real difference.
"From more practical standpoint, having two DOMs for every page will eat even more of my laptop's RAM."
I hope this won't be so bad down the line (assuming we'd be able to
leverage some underlying shadow-DOM infrastructure that performs
relatively well).
[1] http://cowl.ws