It's rare to find devs anticipating Unicode control characters showing up in user input. And one of the most fun to encounter unanticipated is the Right-To-Left Override character, U+202E. Unicode characters have an implicit direction, so that by default, for example, Hebrew characters are rendered from right to left and English characters are rendered left to right. The override characters force an explicit direction for all the text that follows.
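To see the effect, here's a quick sketch in Python of the classic RLO filename trick (how it actually renders depends on how bidi-aware your display is):

```python
# U+202E (RIGHT-TO-LEFT OVERRIDE) forces everything after it to render
# right to left. The string below is "invoice" + RLO + "gpj.exe", but a
# bidi-aware UI displays the tail reversed, so it looks like a .jpg file.
filename = "invoice\u202Egpj.exe"
print(filename)        # appears as "invoiceexe.jpg" in bidi-aware displays
print(repr(filename))  # the escape reveals the real .exe extension
```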
I chose my Twitter display name to include the HTML encoding of the Right-To-Left Override character, &#x202E;, as a sort of joke or shout-out to my favorite Unicode control character.
I did not anticipate that some Twitter clients would fail to encode it correctly in some of their UI. There's no way I can remove it from my display name now.
Try it on Amazon.
Uhh, has anyone noticed Garry Marshall’s Wikipedia page?
Hahaha
Wiki user ‘Gillian Marshal’ (http://en.wikipedia.org/w/index.php?title=Garry_Marshall&diff=prev&oldid=596787114) updated the page yesterday. Nice and subtle, only editing the summary section on the right.
My second completed app for the Windows Store was Words with Hints, a companion to Words with Friends or other Scrabble-like games that gives you *ahem* hints. You provide your tiles and, optionally, letters placed in a line on the board, and Words with Hints gives you word options.
I wrote this the first time by building a regular expression to check against my dictionary of words, which made for a slow app on the Surface. In a subsequent release of the app I now spawn four web workers (one for each of the Surface's cores), each with its own fourth of my dictionary. Each fourth of the dictionary is a trie, which makes it easy for me to discard whole chunks of possible combinations of Scrabble letters as I walk the tree of possibilities.
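The app itself is JavaScript, but the pruning idea translates directly. Here's a minimal Python sketch, assuming a hypothetical trie shape where each node maps a letter to a child node and a `"$"` key marks the end of a valid word:

```python
# Walk the trie, keeping only branches whose letter is still in the rack.
# Any branch whose first letter isn't available is discarded wholesale,
# which is what makes the trie so much faster than a regex scan.
def find_words(node, rack, prefix="", results=None):
    if results is None:
        results = []
    if "$" in node:
        results.append(prefix)
    for letter, child in node.items():
        if letter != "$" and letter in rack:
            remaining = rack.copy()
            remaining.remove(letter)  # consume the tile for this branch
            find_words(child, remaining, prefix + letter, results)
    return results

# Example with a tiny dictionary fragment:
trie = {"c": {"a": {"t": {"$": True}, "r": {"$": True}}}}
print(find_words(trie, list("tacr")))  # ['cat', 'car']
```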
The dictionaries are large and take a noticeable amount of time to load on the Surface. The best performing mechanism I found to load them is as JavaScript source files that simply define their portion of the dictionary on the global object, loaded synchronously (only on the worker, so not blocking the UI thread). Putting them into .js files means they take advantage of bytecode caching, making them load faster. However, because the data is mostly strings and not code, there is a dramatic size increase when the app is installed. The total size of the four dictionary .js files is about 44 MB; the bytecode cache for the dictionary files is about double that, 88 MB, meaning the dictionary plus the bytecode cache is 132 MB.
To handle the bother of postMessage communication and web workers, this was the first app in which I used my promise MessagePort project, which I'll discuss more in the future.
This is the first app in which I used the Microsoft Ad SDK. It was difficult to find the install for the SDK and difficult to use their website, but once set up, the Ad SDK was easy to import into VS and easy to use in my app.
Level 7 of the Stripe CTF involved running a length extension attack on the level 7 server's custom crypto code.
```python
@app.route('/logs/<id>')
@require_authentication
def logs(id):
    rows = get_logs(id)
    return render_template('logs.html', logs=rows)

...

def verify_signature(user_id, sig, raw_params):
    # get secret token for user_id
    try:
        row = g.db.select_one('users', {'id': user_id})
    except db.NotFound:
        raise BadSignature('no such user_id')
    secret = str(row['secret'])

    h = hashlib.sha1()
    h.update(secret + raw_params)
    print 'computed signature', h.hexdigest(), 'for body', repr(raw_params)

    if h.hexdigest() != sig:
        raise BadSignature('signature does not match')
    return True
```
The level 7 web app is a web API in which clients submit signed RESTful requests, and some actions are restricted to particular clients. The goal is to view the response to one of the restricted actions. The first issue is that there is a logs path to display the previous requests for a user, and although the logs path requires the client to be authenticated, it doesn't restrict the logs you view to those of the user as which you are authenticated. So you can manually change the number in '/logs/[#]' to '/logs/1' to view the logs for user ID 1, who can make restricted requests. The level 7 web app can be exploited with replay attacks, but you won't find in the logs any of the restricted requests we need to run for our goal. And we can't just modify the requests, because they are signed.
However, they are signed using their own custom signing code, which can be exploited by a length extension attack. All Merkle–Damgård hash algorithms (which include MD5, SHA-1, and SHA-2) have the property that if you have the hash of data of the form (secret + data), where data is known and the length but not the content of secret is known, you can construct the hash for a new message (secret + data + padding + newdata), where newdata is whatever you like and padding is determined by newdata, data, and the length of secret. You can find a sha-padding.py script on the VNSecurity blog that will tell you the new hash and padding per the above. With that I produced my new restricted request based on another user's previous request. The original request was the following.
count=10&lat=37.351&user_id=1&long=%2D119.827&waffle=eggo|sig:8dbd9dfa60ef3964b1ee0785a68760af8658048c
The new request, with padding and my new content, was the following.
count=10&lat=37.351&user_id=1&long=%2D119.827&waffle=eggo%80%02%28&waffle=liege|sig:8dbd9dfa60ef3964b1ee0785a68760af8658048c
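Where does that padding come from? SHA-1 processes 64-byte blocks and pads each message with 0x80, a run of zero bytes, and the 64-bit big-endian bit length. Here's a minimal sketch of computing the glue padding under that rule, assuming a hypothetical 14-byte secret (in practice you'd brute force the length; producing the forged digest itself requires resuming SHA-1 from the known hash's internal state, which the sha-padding.py script handles):

```python
import struct

def sha1_glue_padding(secret_len, data):
    # SHA-1 pads to a multiple of 64 bytes: 0x80, zero bytes, then the
    # original message length in bits as a 64-bit big-endian integer.
    msg_len = secret_len + len(data)
    pad = b'\x80'
    pad += b'\x00' * ((56 - (msg_len + 1) % 64) % 64)
    pad += struct.pack('>Q', msg_len * 8)
    return pad

# The server hashed (secret + raw_params); append glue padding, then new data.
raw_params = b'count=10&lat=37.351&user_id=1&long=%2D119.827&waffle=eggo'
forged = raw_params + sha1_glue_padding(14, raw_params) + b'&waffle=liege'
# forged is then percent-encoded when placed into the request body.
```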
My new data in the new request is able to overwrite the waffle parameter because their parser fills in a map without checking if the parameter existed previously.
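A minimal sketch of that vulnerable parsing pattern (hypothetical code, not the level's actual parser): duplicate keys silently overwrite earlier ones, so the last waffle wins.

```python
def parse_params(raw):
    # Fills in a dict without checking for duplicates, so a repeated
    # parameter silently replaces the earlier value.
    params = {}
    for pair in raw.split(b'&'):
        key, _, value = pair.partition(b'=')
        params[key] = value
    return params

body = b'waffle=eggo&waffle=liege'
print(parse_params(body)[b'waffle'])  # b'liege'
```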
Code review red flags included the custom crypto-looking code. However, I am not a crypto expert, and it was difficult for me to find the solution to this level.
Winterton, a senior entomologist at the California Department of Food and Agriculture, has seen a lot of bugs. But he hadn’t seen this species before.
There’s no off switch when you’re the senior entomologist. If you’re browsing the web, you find your way to Flickr photos of insects or start correcting Wikipedia articles on insects.
PPACA (aka Obamacare) broken down into its main subsections with brief explanations and citations linking into the actual PPACA document (why is it always PDF?).
It’s interesting to see the very small number of parts folks are complaining about versus the rest, which mostly strikes me as “how could this not already be the case?”
“I’m no expert, and everything I posted here I attribute mostly to Wikipedia or the actual bill itself, with an occasional Google search to clarify stuff. I am absolutely not a definitive source or expert. I was just trying to simplify things as best I can without dumbing them down. I’m glad that many of you found this helpful.”
| HTTP Content Coding Token | gzip | deflate | compress |
|---|---|---|---|
| Definition | An encoding format produced by the file compression program "gzip" (GNU zip) | The "zlib" format as described in RFC 1950 | The encoding format produced by the common UNIX file compression program "compress" |
| Data Format | GZIP file format | ZLIB Compressed Data Format | The compress program's file format |
| Compression Method | Deflate | Deflate | LZW |

Deflate consists of LZ77 and Huffman coding.
Compress doesn't seem to be supported by current popular browsers, possibly due to its past patent encumbrance.
Deflate isn't always done correctly. Some servers send the raw deflate data format instead of the zlib data format, and at least some versions of Internet Explorer expect the raw deflate data format instead of the zlib data format.
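A common defensive pattern for a client, sketched here in Python with the standard zlib module, is to try the zlib wrapper first and fall back to a raw deflate stream:

```python
import zlib

def decode_deflate_body(body):
    # Content-Encoding: deflate should mean the zlib format (RFC 1950),
    # but some servers send a raw deflate stream (RFC 1951) instead.
    try:
        return zlib.decompress(body)       # zlib-wrapped deflate
    except zlib.error:
        return zlib.decompress(body, -15)  # negative wbits = raw deflate
```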
Wikipedia’s alternate names for the slash include stroke, solidus, and separatrix.
Answers questions like “When will the Sun boil away the Earth’s oceans?” and “When will the Sun burn out?”, but brings up new questions, like which supercontinent configuration will win. I’m hoping for Pangea Ultima, as it has the best name.
Interesting article on an expert attempting to modify an article on Wikipedia. Sounds like an issue when presented in this fashion, but looking at it from Wikipedia’s perspective, I don’t know how they could do better.