2008 Feb 18, 3:05
A case study on the origins of a humorous mistranslation. FTA: "The really weird ones are apparently from dictionary look-up errors ... not just taking an unlikely choice from the correct entry, but
actually reading a different (but nearby) entry."
humor language blog article translate mistranslation languagelog
2008 Feb 3, 11:01
FTA: "Like Klein, EFF senior staff attorney Kevin Bankston had spent much of the previous December reading press accounts of the administration's secret surveillance program. "It was all I thought
about over the holiday," he remembers. In fact, at his bos
via:boingboing eff article law privacy government history
2008 Jan 14, 10:16
Stephen Toub implements searching the closed captions of videos recorded with Windows Media Center through Windows Desktop Search as an IFilter. I wanted to do the same thing after reading the related
Ars Technica article. Other interesting things in the
.net mce programming reference video caption dvr-ms howto ifilter development com software microsoft msdn blog article
2007 Dec 24, 12:41
These days it seems like there's a social sharing website for everything representable as bits. Like
Scribd for (mostly legal) documents,
SciVee for scientific research videos,
Wordie for words, and
Kuler for color themes. Kuler seems
like a ridiculous website (overkill) but I had been meaning to update my homepage's color design and Kuler has an
RSS-based REST API.
The API lets you obtain things like the most recently added color themes, the most popular themes, or all themes containing the color dark red, etc. So of course, rather than update my website's design, I
hooked up my CSS to the color themes coming out of Kuler. My main page's color theme is selected from a
list of random Kuler themes. As I'm sure
the regular readers can guess, I use
an XSLT and blah blah blah... It looks OK with
Silver Surfer and
Happy Hipo but in general
changing the colors this way doesn't produce something pretty.
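For the curious, the underlying idea is roughly the following. The post does this with an XSLT, so the PHP below is only a hedged sketch of the same transform; the feed URI, the list type, and the notion that swatch hex values can be scraped out of each item are illustrative guesses rather than Kuler's documented API.
<?php
// Hedged sketch: pull one theme from the Kuler RSS feed and write a tiny CSS file.
// The feed URI and the hex-scraping approach are assumptions for illustration;
// the real version is an XSLT.
$feedUri = 'http://kuler.adobe.com/rss/get.cfm?listType=random&itemsPerPage=1'; // assumed
$feed = simplexml_load_file($feedUri);
if ($feed === false) {
    die("Couldn't load the Kuler feed.\n");
}

// Grab the first item and pull anything that looks like a 6-digit hex color out of it.
$item = $feed->channel->item[0];
if ($item === null) {
    die("No themes in the feed.\n");
}
preg_match_all('/\b[0-9A-Fa-f]{6}\b/', $item->asXML(), $matches);
$colors = array_values(array_unique($matches[0]));
if (count($colors) < 3) {
    die("Couldn't find enough swatches in the feed item.\n");
}

// Map the first few swatches onto the site's CSS rules.
$css = sprintf("body { background: #%s; color: #%s; }\na { color: #%s; }\n",
    $colors[0], $colors[1], $colors[2]);
file_put_contents('theme.css', $css);
?>
Presumably the XSLT version does the same thing: map a handful of swatches from the feed onto the selectors the page already uses.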
When reading about Kuler I found that they may have stolen the whole idea wholesale from
ColourLovers. They discuss
the thievery in an article on their blog. I would have switched over to ColourLovers on principle, but
they don't have an easily accessible API.
colourlovers color xslt theme homepage technical kuler design
2007 Nov 28, 1:23
One of the new Zune features that had me the most excited was the claimed improved Windows Media Center integration, which unfortunately turned out to simply mean support for the Win MCE video format (with an exception for HD). I wanted to be able to pick shows recorded by my Win MCE and have the Zune automatically sync up the
latest episodes. However, with the improved podcast support in the Zune software one can easily create a ridiculous hack to accomplish this.
The new Zune software has podcast support that does everything I'd want to do with a
Win MCE recorded TV series, so the goal is to shoehorn a TV series into a Zune podcast. An overview of the steps: create an XSLT that converts Win MCE data to a podcast, run the XSLT as a scheduled
task every few hours per TV series, set up a Web server pointed at the resulting podcasts and the Win MCE Recorded TV directory, and subscribe to the resulting podcasts in the Zune software.
- Reading through the Win MCE data stored as an XML file in "C:\ProgramData\Microsoft\eHome\Recording\Recordings.xml" and the spec for podcasts, I created an XSLT to convert a series from Win MCE data to a podcast (a rough sketch of the idea appears after this list).
- I added a new task to the Scheduled Tasks to run my XSLT using my xsltproc.js script. The task runs a handful of commands that look something like the following:
C:\windows\system32\wscript.exe C:\users\dave\bin\xsltproc.js C:\Users\Dave\Documents\trunk\development\mce-zune\mce-to-podcast.xslt
C:\ProgramData\Microsoft\eHome\Recording\Recordings.xml --param title "The Daily Show With Jon Stewart" --param max 4 --param baseURI "http://groucho/" --param thisRelURI "tds.xml" -o
"D:\recorded tv\tds.xml"
For each TV series I run a command like the above and that outputs a podcast for that series into my "D:\Recorded TV\" directory.
- Zune only allows HTTP URIs for its podcasts, so I installed a web server on my Win MCE server. I'm running Vista Ultimate, so it was quick and easy for me to install IIS7, but any Web server will do. Then I pointed it at "D:\Recorded TV\".
- Once all the above was done I just subscribed to the resulting podcasts via my Web server and voila! Since I'm forced to use a Web server I can even run the Zune software on a machine other
than my Win MCE server. You can see a screenshot above of my Zune software showing my Colbert Report podcast.
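The actual conversion is the XSLT mentioned in the first step; as a rough illustration of what that transform has to do, here is the same idea rendered in PHP. The element names read out of Recordings.xml (Recording, Title, EpisodeTitle, FileName) and the enclosure MIME type are guesses for illustration, not the file's documented schema.
<?php
// Illustrative sketch of the Recordings.xml -> podcast conversion described above.
// The real version is an XSLT driven by xsltproc.js as a scheduled task.
$seriesTitle = 'The Daily Show With Jon Stewart';
$baseUri     = 'http://groucho/';
$maxItems    = 4;

$recordings = simplexml_load_file('C:\\ProgramData\\Microsoft\\eHome\\Recording\\Recordings.xml');

$rss = new SimpleXMLElement('<rss version="2.0"><channel/></rss>');
$channel = $rss->channel;
$channel->addChild('title', $seriesTitle);

$count = 0;
foreach ($recordings->Recording as $rec) {                 // assumed element name
    if ((string)$rec->Title !== $seriesTitle) continue;    // assumed element name
    if (++$count > $maxItems) break;

    $item = $channel->addChild('item');
    $item->addChild('title', (string)$rec->EpisodeTitle);  // assumed element name
    // Point the enclosure at the dvr-ms file through the web server from the next step.
    $enclosure = $item->addChild('enclosure');
    $enclosure->addAttribute('url', $baseUri . rawurlencode(basename((string)$rec->FileName)));
    $enclosure->addAttribute('type', 'video/x-ms-dvr');    // assumed MIME type
}

$rss->asXML('D:\\Recorded TV\\tds.xml');
?>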
technical xml mce hack windows media center zune windows xslt podcast
2007 Nov 6, 2:46
Video of TED lectures. TED is (from Wikipedia) "... an annual conference held in Monterey, California and recently, semi-annually in other cities around the world. TED describes itself as a "group of
remarkable people that gather to exchange ideas of inc
analysis blog video visualization internet social technology ted business news ideas conference
2007 Oct 20, 3:07
Bill Hill's blog on reading and the Internet
bill-hill reading blog microsoft internet
2007 Sep 12, 6:54
I'm visiting
Wikipedia more and more recently but I always find myself reading the referenced webpages to get the full context of quotes and for
more info. Basically I use Wikipedia as an introduction and a place to look for links. For times when I'm looking for opinions rather than facts I like to use
Everything2. No need to check references there.
There's the much-hyped
WikiScanner tool, which reports who has been making anonymous (thought to be anonymous at the time anyway) edits to
Wikipedia. It's humorous and interesting in a few cases, but in general I think it's a stretch to say that because an IP address range is owned by a corporation and someone edited Wikipedia from an IP
in that range, you can attribute that edit to that corporation. If I edited Wikipedia I'd probably do a bit of that during my lunch break, but that wouldn't mean that Microsoft wants the
Wikipedia pages for Weird Al, Dave Risney, URIs, or whatever else I would edit on Wikipedia changed.
Also, via
Everything Is Miscellaneous I found the tool
Wiki Dashboard. Wiki Dashboard proxies
Wikipedia and on each page shows a timeline view at the top of who made edits and when. It's nice to see a gentle curve down from an initial spike at the beginning for topics you don't imagine to be
controversial. As the canonical test page for this service I looked up 'Elephant', the
Wikipedia page Stephen Colbert
suggested folks vandalize on his show on 2006 July 31st. If you look at the
Wiki Dashboard Elephant page you can see a very large spike
in edits on that date. That's all I need to see.
As a side note, for the link on Stephen Colbert suggesting folks vandalize Wikipedia I linked to a Wikipedia article. Is it inappropriate to provide info about Wikipedia being vandalized and thus
incorrect via a link to a Wikipedia article?
wikidashboard stephen-colbert wikality wikipedia wikiscanner colbert-report
2007 Aug 13, 3:35
I've been told that family members, after reading my webpage which contains some technical related material, would turn to my cousin's webpage. So, in an effort to not drive away
readers I've...
video
2007 Jul 4, 10:58
Hackdiary
I really enjoy reading Matt Biddulph's blog
hackdiary. An entry some time ago talked about his
Second
Life flickr screen, which is a screen in Second Life that displays images from flickr.com based on viewers' suggested tags. I'm a novice to the Second Life scripting API, so it was from this
blog post I became aware of the
llHTTPRequest. This is like the XMLHttpRequest for Second Life code in that it lets you make HTTP requests.
I decided that I too could do something cool with this.
Translator
I decided to make a translator object that a Second Life user would wear that would translate anything said near them. The details aren't too surprising: the translator object keeps an
owner-modifiable list of translation instructions, each consisting of who to listen to, the language they speak, who to tell the translation to, and into what language to translate. When the translator
hears someone, it runs through its list of translation instructions and, when it finds a match for the speaker, uses llHTTPRequest to send off what was said to
Google Translate. When the result comes back the translator simply says the response.
Issues
Unfortunately, llHTTPRequest limits the response size to 2K and no translation site I can find has the translated text in the first 2K. There's a flag HTTP_BODY_MAXLENGTH provided, but it defaults
to 2K and you can't change its value. So I decided to set up a PHP script on my site to act as a translating proxy and parse the translated text out of the HTML response from Google Translate. Through
experimentation I found that their site can take text and langpair parameters in the query like so:
http://translate.google.com/translate_t?text=car%20moi%20m%C3%AAme%20j%27en%20rit&langpair=fr|en
. On the topic of non-US-ASCII characters (which is important for a translator), I
found that llHTTPRequest encodes non-US-ASCII characters as percent-encoded UTF-8 when constructing the request URI. However, when Google Translate takes parameters off the URI it only seems to
interpret them as percent-encoded UTF-8 when the user-agent is IE's. So after changing my
PHP script to use IE7's user-agent,
non-US-ASCII character input worked.
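A minimal sketch of what that proxy looks like, assuming the 2007-era Google Translate markup; the result_box regex and the exact IE7 user-agent string below are assumptions for illustration, not the script I actually ran:
<?php
// Sketch of the translating proxy: take text and a language pair from the Second Life
// object, forward them to Google Translate with an IE user-agent so the percent-encoded
// UTF-8 gets interpreted correctly, and return only the translated text so the response
// fits within llHTTPRequest's 2K limit.
$text     = isset($_GET['text']) ? $_GET['text'] : '';
$langpair = isset($_GET['langpair']) ? $_GET['langpair'] : 'fr|en';

$uri = 'http://translate.google.com/translate_t?text=' . rawurlencode($text) .
       '&langpair=' . urlencode($langpair);

// Claim to be IE7 so the query parameters are treated as percent-encoded UTF-8.
$context = stream_context_create(array('http' => array(
    'header' => "User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)\r\n"
)));
$html = file_get_contents($uri, false, $context);

// Pull the translated text out of the HTML and hand back just that.
// The result_box id is a guess at the 2007-era markup, which has long since changed.
if ($html !== false && preg_match('/<div[^>]*id="?result_box"?[^>]*>(.*?)<\/div>/s', $html, $m)) {
    header('Content-Type: text/plain; charset=utf-8');
    echo html_entity_decode(strip_tags($m[1]), ENT_QUOTES, 'UTF-8');
} else {
    echo 'translation failed';
}
?>
The translator object then makes its llHTTPRequest against this script rather than Google directly, and simply says whatever plain text comes back.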
In Use
Actually using it in practice is rather difficult. Between typos, slang, abbreviations, and the current state of the free online translators, it's very difficult to carry on a conversation.
Additionally, I don't really like talking to random people on Second Life anyway. So... not too useful.
personal translate second-life technical translator sl code google php llhttprequest
2007 Apr 11, 12:58
An old Ars Technica article about getting the most out of your nano-compiler (April Fools article). I remember reading this in high school.
humor nano nanocompiler fabrication article
2007 Mar 19, 1:03
Help ensure that projects to turn books into text files are correct by proofreading the results.
books book gutenberg literature internet volunteer free
2007 Mar 13, 7:57
I had a few thoughts after reading about
OpenID. However, after doing only a very small amount of digging I can see these aren't new thoughts.
- Anonymous OpenID: Have an OpenID that anyone can use because it performs no authorization. You'd specify a URI like http://deletethis.net/anonymousopenid/yournamehere and you'd immediately get an anonymous OpenID associated with that URI. This has already been implemented by Jayant Gandhi.
- Group OpenID: Have an OpenID that consists of a group of member OpenIDs. To log in as the Group OpenID you need to log in with any of the member OpenIDs. This is discussed more by Dmitry Shechtman on his blog.
- OpenID Normalization: I find that I already have a couple of OpenIDs without even trying, due to AOL giving out OpenIDs. I'd like all of my OpenIDs to point to one canonical OpenID. It looks like this may already be possible per the OpenID specification.
I guess I'm a little late to the scene.
technical stolen-thoughts openid
2007 Feb 25, 11:58
Shirt with a picture of Gaius Baltar (from Battlestar Galactica) reading "Not My President"
humor shirt merch baltar bsg battlestar politics neat-fp