Cross Site XMLHttpRequest

1,680 bytes added, 09:10, 28 February 2007
Security worries
* The first thing that worries me is that you could make POST submissions to any URL and include XML data as the payload. It is already possible to make POST submissions to any URL, but the only possible payloads are text/plain or application/x-www-form-urlencoded encoded form data, or multipart/form-data encoded files and form data. With Cross-Site XMLHttpRequest it would be possible to send XML data. In particular, the worry is that this would make it possible to send SOAP requests to any server. Note that while the page would be unable to access the data returned by the SOAP request, that isn't necessary if the request itself is "transfer all the user's money to account 12345-67". To avoid this we could either use the model for non-GET/non-POST requests defined in the XHR spec [http://lists.w3.org/Archives/Public/public-webapi/2006Jun/0012], or use something like [http://lxr.mozilla.org/mozilla/source/extensions/webservices/docs/New_Security_Model.html]
** It is already possible to POST arbitrary data using <form enctype="text/plain">
** We should still investigate whether this will mess up SOAP servers
** Using a magic URL is bad if you want different policies for different files in the same directory.
** Caching should prevent excessive extra GETs, even in the case of POST
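The point that arbitrary payloads are already possible can be illustrated with a short sketch of how a browser serializes a text/plain form submission (the helper name is hypothetical):

```javascript
// Sketch of how a browser serializes <form enctype="text/plain"> data:
// each field becomes "name=value" on its own line, with no escaping at all.
// Because nothing is escaped, a crafted field can smuggle an XML-looking
// payload in a cross-site form POST today. Helper name is hypothetical.
function encodeTextPlainForm(fields) {
  return fields.map(([name, value]) => `${name}=${value}`).join("\r\n");
}

// A field whose name and value contain markup survives unescaped:
const body = encodeTextPlainForm([
  ["<q><a>", "</a></q>"],
]);
// body is now "<q><a>=</a></q>" -- nearly free-form markup in a plain form POST
```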
* Should we try to follow these specs even when accessing files on the same domain? From the site's point of view they can't rely on that anyway, since not all browsers support the access-control spec (and old versions never will).
** No. It'll just trick developers into thinking they are protected against things they really aren't.
* We have to make sure not to notify the onreadystatechange listener or any other listeners until we've done all access-control checks. Otherwise it would be possible to probe for the existence of files on other servers, even though you couldn't actually read their content.
** Should be taken care of by the inner nsIStreamListener approach
* We have to make sure not to put data in .responseText until we've passed the access-control checks, even for XML files.
** Should be taken care of by the inner nsIStreamListener approach
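The inner-listener idea in the two bullets above can be sketched as a buffer that holds back incoming data until the access-control decision is made (class and method names are assumptions, not the real nsIStreamListener API):

```javascript
// Sketch (names are hypothetical, not the real nsIStreamListener API):
// buffer incoming data and release it to the outer XMLHttpRequest only
// after the access-control check has passed; on denial, drop everything
// so nothing ever reaches responseText or the onreadystatechange listener.
class BufferingListener {
  constructor(deliver) {
    this.deliver = deliver; // called only once access is granted
    this.buffered = [];
    this.granted = false;
  }
  onDataAvailable(chunk) {
    if (this.granted) this.deliver(chunk);
    else this.buffered.push(chunk); // hold back until the check is done
  }
  accessGranted() {
    this.granted = true;
    for (const chunk of this.buffered) this.deliver(chunk);
    this.buffered = [];
  }
  accessDenied() {
    this.buffered = []; // drop silently; the caller sees only a generic error
  }
}
```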
* We have to make it impossible to distinguish between an access-control failure and network errors such as 404s. Can the implementation "re-cancel" a canceled channel?
** Re-canceling might be possible; we have to check the implementations.
** An alternative might be to make sure that clients of the new code don't use the error code on the channel but rather the one passed in from onStartRequest/onStopRequest
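One way to guarantee the two failures stay indistinguishable is to normalize the status before it ever reaches script. The constants below are illustrative placeholders, not Mozilla's real error codes:

```javascript
// Sketch: collapse the internal "access denied" status into the same
// generic failure code a network error produces, so a page cannot use
// error differences to probe for a file's existence. The constant values
// are illustrative placeholders, not Mozilla's real error codes.
const GENERIC_NETWORK_FAILURE = 0x80004005;
const ACCESS_CONTROL_DENIED = 0x805e0001; // hypothetical internal code

function statusForCaller(internalStatus) {
  // Never leak the denied-vs-unreachable distinction to callers.
  return internalStatus === ACCESS_CONTROL_DENIED
    ? GENERIC_NETWORK_FAILURE
    : internalStatus;
}
```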
* Should we check for PIs even if HTTP headers have said that access is granted? It will always be possible to circumvent those headers using .mimetypeOverride, which will make us not treat the document as XML, so we won't even look for PIs. Alternatively we could ignore .mimetypeOverride when checking for PIs, but that might be a problem with poorly configured servers (which are the whole reason for .mimetypeOverride)
** Do not pay attention to .mimetypeOverride when checking for PIs
** If headers grant access do check for PIs
** If headers deny access, don't check for PIs
*** This is so that it's easy to deny access everywhere at the server level
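One reading of the header/PI rules above can be summarized in a small decision function (the function name is hypothetical):

```javascript
// Sketch of one reading of the rules above (function name is hypothetical):
// a server-level header deny wins outright and skips the PI scan; a header
// grant still requires the document's PIs to agree; with no header at all,
// the PIs decide on their own. The PI scan itself must ignore
// .mimetypeOverride, per the first sub-bullet.
function accessAllowed(headerDecision, piGrantsAccess) {
  if (headerDecision === "deny") return false; // easy server-wide deny
  return piGrantsAccess; // "allow" or no header: PIs still get a say
}
```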
* We should make it impossible to set authentication headers, since that would make it easier for a site to attempt (distributed) brute-force attacks against authenticated servers. Note, though, that such an attack would be significantly complicated by the fact that the server must be password protected yet still have files that it grants a third-party server access to, which doesn't really make a lot of sense.
** Also disallow passing login arguments to .open()
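The header filtering this implies could look roughly as follows (the blocklist is illustrative, not exhaustive):

```javascript
// Sketch: refuse authentication-related headers on cross-site requests,
// so the API cannot be turned into a distributed brute-force tool.
// The blocklist below is illustrative, not exhaustive.
const BLOCKED_CROSS_SITE_HEADERS = new Set([
  "authorization",
  "proxy-authorization",
]);

function maySetRequestHeader(name, isCrossSite) {
  if (!isCrossSite) return true; // same-site requests are unaffected
  return !BLOCKED_CROSS_SITE_HEADERS.has(name.toLowerCase());
}
```

The .open() user/password arguments would be rejected by a similar check before the request is dispatched.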
* Timeless left some comments at [http://docs.google.com/Doc?id=dhmd4jxt_27ggbhc8]
 
* Should we send authentication information with the first GET request in the case where we do two requests? Should we send cookies? An alternative is to prefix authentication and cookie headers with 'xmlhttprequest-' or similar, to avoid affecting existing servers while allowing aware servers to look at the relevant headers.
** We might as well include authentication headers and cookies in the original GET, since that request can be made by any third party anyway.
** Not including the authentication header makes things harder on CGIs, since the web server might deny access before the CGI even gets a chance to react.
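The prefixing alternative could be sketched like this (the helper name is hypothetical; the 'xmlhttprequest-' prefix is the one floated above):

```javascript
// Sketch of the prefixing alternative: rename cookie and authentication
// headers with an "xmlhttprequest-" prefix so existing servers ignore
// them while aware servers can opt in and read them. The helper name is
// hypothetical; the prefix is the one suggested above.
function prefixSensitiveHeaders(headers) {
  const out = {};
  for (const [name, value] of Object.entries(headers)) {
    const lower = name.toLowerCase();
    if (lower === "cookie" || lower === "authorization") {
      out["xmlhttprequest-" + lower] = value; // aware servers look here
    } else {
      out[name] = value; // everything else passes through untouched
    }
  }
  return out;
}
```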
* Do NOT send custom headers or cookies when talking to external sites -- this risks exposing sensitive IDs, usernames, and passwords when talking to third party services.
* I don't see an adequate threat model described here -- what are the kinds of activities that a potential attacker might use this channel to do, and what are some ways to prevent this? For example, how will cross site XHR be used in conjunction with cross site scripting attacks?