use HTML code in MediaWiki

internet malware technology

since it is pretty tricky to google for, here is the wonderful, scary-as-hell MediaWiki extension that lets you add raw HTML code to your pages:
[make sure to read the end of this post]


<?php
# RawHtml.php - raw HTML extension
#
# Defines the <RawHtml> ... </RawHtml> tag pair.
# Sends the content out without any processing.
#
# To use, include this file from your LocalSettings.php
# To configure, set members of $wgRawHtml after the inclusion.
#
# include 'RawHtml.php';
#
# $wgRawHtml = array('JoeUser', 'JoeUserBot');
#
# Adapted from code by Jan Steinman

class raw_html_settings { };

$wgRawHtml = new raw_html_settings;
$wgExtensionFunctions[] = 'wf_raw_html_ext';

function wf_raw_html_ext() {
    global $wgParser;
    # register the <RawHtml> tag with the parser
    $wgParser->setHook('RawHtml', 'render_raw_html');
}

function render_raw_html($raw_html_src, $style='') {
    # pass the tag content through completely unprocessed
    return $raw_html_src;
}
?>

found here.

It really is easy to use: just save the file as RawHtml.php and then add the following lines at the end of LocalSettings.php:


include 'RawHtml.php';
$wgRawHtml = array('user-name-to-use-this-goes-here' , 'this-would-be-a-second-one');

It turns out that the user names get completely ignored. So this is actually really dangerous to do, since ANYBODY who can edit the wiki can also insert arbitrary HTML code. Which is OK in a non-public wiki, but NOT out there on those internets.

So the code above is plain malware: a bot could crawl wikis for this extension and insert whatever HTML it pleases into those pages. A lot of harm can be done that way.
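If that user-name list were actually enforced, the render callback would have to check who is editing before passing anything through. Here is a minimal, self-contained sketch of the idea; the function and parameter names are mine, not the real MediaWiki hook signature, and in a real extension the user would come from the wiki context:

```php
<?php
// Hypothetical fix: only pass raw HTML through for allow-listed users,
// and escape it for everyone else. Illustrative names, not MediaWiki API.
function render_raw_html_checked($raw_html_src, $user, $allowed_users) {
    if (in_array($user, $allowed_users, true)) {
        return $raw_html_src;                  // trusted user: emit as-is
    }
    return htmlspecialchars($raw_html_src);    // untrusted: neutralize the markup
}
?>
```

With `$wgRawHtml = array('JoeUser')`, JoeUser's markup would render, while anyone else's would come out as harmless escaped text.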

For a decent explanation of how to add your own extension, look here

I ended up boiling up a couple of probably horrible PHP lines myself:


<?php
# mimg.php
# Insert images into MediaWiki pages.
# To use, add code like:
#
# <mimg>/path/to/image.png</mimg>
#
# Please note that I have no freaking clue what I am doing.
#
# This will only work with local links to images, since all
# characters apart from numbers, letters, slash and dot get filtered out when rendered.
# To install, save this in a file and include it from LocalSettings.php.

class mimgclass { };

$mimgo = new mimgclass;
$wgExtensionFunctions[] = 'installmimg';

function installmimg() {
    global $wgParser;
    # register the <mimg> tag with the parser
    $wgParser->setHook('mimg', 'mrender_mimg');
}

function mrender_mimg($mimg, $style='') {
    # strip everything but letters, digits, slash and dot from the path
    $mimg = preg_replace('/[^a-zA-Z0-9\/\.]/', '', $mimg);
    return "<img src=\"$mimg\" />";
}
?>
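To see what that character filter actually buys you, here is the same preg_replace run against a hostile path (the example input is mine):

```php
<?php
// Everything outside letters, digits, slash and dot is stripped,
// so quotes and injected attributes cannot survive into the img tag.
$src   = '/images/a.png" onerror="alert(1)';
$clean = preg_replace('/[^a-zA-Z0-9\/\.]/', '', $src);
echo $clean; // /images/a.pngonerroralert1
?>
```

The attempted attribute injection collapses into harmless letters inside the path, which is why only plain local links work.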

JJ Abrams

media

digital movie making

confessions of a pixel pusher datalab

It is interesting to see how different the current workflows can be. Like
this or that.

Judging by the end this is already from 2003, but still worth looking at:
Robert Rodriguez talking about HD. I was never a fanboy of his,
but I think he expresses a couple of very interesting concepts rather well in this piece.

transmit droplets created from within Interdubs

confessions of a pixel pusher interdubs technology

Running and developing a system at the same time is a lot of fun. An idea can be quickly added and/or tried. Some are more
involved though. At this moment there are 42,658 files in Interdubs, so uploading happens a lot. There was an FTP interface, but people
needed passwords and had to remember the folder name.

It could be easier. And now it is. It’s as simple as clicking on a link:

A Transmit droplet with the proper parameters gets created and downloaded automatically. Those droplets can be kept in the Dock or on the desktop, and uploading is even easier than it was.

As with so many nice and easy things, the underlying technology is actually not that simple. It was great to be able to draw from the resources and experience of the amazing people at Oneiric to get the backbone for this service addition installed. David Green was super helpful; without him this feature would have taken weeks longer to implement. Working with David is a lot of fun, since everything he says he will do, he does. And it works, since he has tested and checked it from the get-go.

It is truly interesting how a small company with people that care can have so much more impact than larger ones that take weeks to move.

Richard Kerris goes to Lucasfilm

Apple confessions of a pixel pusher

Richard Kerris leaves Apple to become CTO at Lucasfilm

why it won’t work

Sony

Sony pulls another Sony

In their recent PS3 sales success and blu-ray coup I had totally forgotten how thick Sony can be.

suExec for Apache under OS X

Apple confessions of a pixel pusher OSX technology

In order to get Apache running with suExec under OS X 10.4.11 and also have PHP, you will need to do the following:

1) get the Apache sources (1.3.39)

2) get the PHP 4 sources (php-4.4.8)

3) extract both in the same directory, then go into the PHP one and run:

./configure --prefix=/usr \
--sysconfdir=/etc \
--localstatedir=/var \
--mandir=/usr/share/man \
--with-xml \
--with-apache=../apache1.3.39

make
sudo make install

4) then go into the apache folder and

./configure --with-layout=Darwin \
--enable-module=most \
--enable-shared=max \
--enable-suexec \
--suexec-caller=www \
--suexec-docroot=/Library/WebServer/Documents \
--activate-module=src/modules/php4/libphp4.a \
--suexec-userdir=Sites

make -j2
sudo make install

5) I had to change /etc/httpd/httpd.conf like this:

comment out modules in httpd.conf
#LoadModule userdir_module libexec/httpd/mod_userdir.so

#LoadModule php4_module libexec/httpd/libphp4.so
#LoadModule hfs_apple_module libexec/httpd/mod_hfs_apple.so
#LoadModule bonjour_module libexec/httpd/mod_bonjour.so

#AddModule mod_userdir.c

AddModule mod_php4.c
#AddModule mod_hfs_apple.c
#AddModule mod_bonjour.c

Please note that mod_php4 gets added but not loaded, probably because it got compiled in statically.
My httpd refused to start with hfs_apple or bonjour loaded, and crashed with userdir.

install Apple developer tools from the command line

Apple confessions of a pixel pusher linux OSX

For years I have worked on a couple of computers via the command line. Since they are real Unix computers, it all works remarkably well. For a specific solution I need to run osacompile: AppleScript needs to get compiled, and I did not find a way to distribute it as text. So I finally got hold of an OS X machine on the internet. More on that part later. osacompile really wants to run the application that it will talk to later. Also rather odd. But, hey, we are talking Apple here. A sect in the disguise of a technology company. So everything is possible. Or rather impossible. Like adding a development environment. The box happened to have no Dev Tools installed. Usually that’s maybe a bit time-consuming but overall straightforward: installing development tools on a Unix computer.
With Apple OS X 10.4.11 it turns out that doing so via ssh is not as trivial. You can download the installer. But first you need to create a developer account with ADC. It’s free. It’s annoying. They keep forgetting my password. Once you are logged in,
you could download the dmg file to your local machine. I could have done that and then waited only a couple of weeks for my DSL to upload the 900+ MB file to the final server I need it on. Downloading the dmg directly did not work, so I had to fake a login. Which is easier than it seems. In the browser that is logged in (Firefox, I assume) you look for a cookie called ADCDownloadAuth. This you copy and paste into the following command line:

curl -b "ADCDownloadAuth=SomeVeryLongCookieString" -O \
http://adcdownload.apple.com/Developer_Tools/xcode_2.5_developer_tools/xcode25_8m2558_developerdvd.dmg

At least that’s the valid file as of today.
Once you have the file, you attach (aka mount) it via:

hdiutil attach xcode25_8m2558_developerdvd.dmg

and navigate into the mounted volume (the space in the volume name needs quoting on the shell):

cd "/Volumes/Xcode Tools/Packages"

to then run:

sudo installer -verbose -pkg XcodeTools.mpkg -target /

Don’t run this against the XcodeTools.mpkg in /Volumes/Xcode Tools directly. That results in the error message:

2008-01-09 03:47:43.889 installer[2843] IFPkg::_parseOldStyleForLanguage - can't find .info file (XcodeTools)

which does not google very successfully.

The install seems to work, from what I can tell so far. I have gcc and make. And that’s all I cared for.

Fucking Apple

Apple misc

We have this laptop that has a glitch: the backlight goes out if you hold it wrong. It’s an older machine (iBook 600MHz) and it got replaced. But today I wanted to get it going as a server again. It’s a Unix computer, and 600MHz is plenty to serve a couple of web pages. It runs OS X 10.3.9, as long as it has not been updated. Now I try to get it on the Apple AirPort. And that is the problem: it does not. Beachball for a while, and then the error message:

There was an error joining the AirPort network "yournamehere"
Try again // OK

What the fuck!
First: that “Try again” is bullshit. It has never worked. Never ever.

Secondly: what exactly was the error? I am sure the computer knows a little bit more than ‘error’.
The AirPort works just fine; it is talking to two other computers right now. So why does Apple write code that does not give you the slightest hint what the problem might be? I don’t expect a hex dump to be slapped into each user’s face. But somewhere, maybe in a log file (!), the machine could give me a hint about what it is upset about. It’s one thing that Apple stuff does not work with Apple stuff. But to be silent about any causes or reasons is just plain stupid and ignorant. Fucking Apple Computers. There are other companies that are equally crap. Just that Apple runs around with this attitude of being better and more user friendly. Actually they are not. They are just better liars:

Ten years ago Apple introduced the iMac. Which is a great machine. And a great concept. Watch at the end of this 7-minute clip how that man calls a circular (!!) piece of plastic “the most wonderful mouse you have ever used”. It’s exactly this arrogant attitude that makes Apple so annoying.

format peace

confessions of a pixel pusher history media Sony technology

post format war

It is hard to imagine HD DVD coming back from the blow that Warner’s Blu-ray decision delivered. The internet was busy speculating about half a billion dollars in bribes that supposedly came rolling down Barham Blvd. I think that the sales performance of DVD makes the studios very nervous. All too quickly they got used to the huge volume of DVD revenue, and a steady increase of it for that matter. The average American bought DVDs for $53 and rented them for $25 in 2007. And he/she paid $32 at the cinema box office. For both HD formats combined, a single dollar left people’s purses in the last year.

In total, in billions of dollars, these numbers look like:

16 DVD sales
7.5 DVD rentals
0.3 next-gen DVD formats (both)
9.6 box office

The troubling point for the studios seems to be that DVD sales are declining. Already in 2005, DVD set-top box sales had gone down for the first time in history. Back then it probably was the fanfare about the ‘next thing’: people don’t like to buy yesterday’s gadget. The studios felt they needed to get HD via DVD going. And Sony did the better show-and-number exercises.

Both formats’ encoding technology, bandwidth and other core parameters are pretty similar. As Mike Curtis alludes to, the scripting technology in HD DVD seems to be more open and developer friendly, and therefore hugely favorable over the bloated Java-based Blu-ray implementation. But what do you expect from Sony.

Flat panel display sales have taken off, and in about a year analog TV will be turned off. With the format war being over, Blu-ray sales should surge. And, I think, they will. Initially. Many Blu-ray players will be PS3s. After correcting the outrageous price, Sony’s next-gen box finally had some sales worth mentioning. How many people bought the black box because they could not get the cute white one is a different story.

I wouldn’t be surprised if combined DVD and Blu-ray sales volumes came out flat in 2008 and declined further from there. There are three reasons for this future disappointment:

* It’s the internet, stupid.
Not only the net alone. Technology progresses everywhere. Hell, my toaster wants more attention than its great-grandfather did 20 years ago. Media is omnipresent. VHS had to compete with, well, books and TV. Maybe radio, cinema and newspapers. That’s about it. Blu-ray faces a vastly different world. None of the existing media emanations will just fade away. And new ones get created at an increasing pace. There is simply not enough time to watch all those movies.

* we don’t care since you don’t care
The studios have failed to understand their own product. There is a history to this. And others have failed similarly: the music industry would be in much better shape had it not confused the means of peddling circular things with the end of enabling people to enjoy music. Both HD formats allow for visibly better quality compared to DVD. Better bandwidth and modern codecs could make for a great experience. Despite this potential, most of the early discs that were available have been widely criticized for their poor transfers. Some people felt that they would be better off with a decent upscale of a good-quality DVD. People love movies. A considerable slice of the population, and almost certainly the majority of early (media) tech adopters, care for a good experience. The studios should have put the utmost emphasis on quality. And that starts with the film transfer. Even though the studios are not keen to involve creative people more than absolutely necessary, they should have gotten them on board for the launch of the new media. Imagine Steven Spielberg approving a 5-movie disc set claiming “this is how I want my movies to be seen”. People would spend a lot of money for this. They would get players, lay cable. The whole thing. Maybe the studios should have gotten together with the ACE and the Directors Guild to develop an approval system: pay directors and DPs to sign off on a transfer. I would pay happily, knowing that the creative vision was intact. OK, in some cases I would simply be paying for the drug habit of that one-hit-wonder boy. But I do that anyway, one way or another.

* it’s complicated
HDMI 1.3 is really exciting, since it not only features greater bit depths but also could carry the extended xvYCC color space. While true, not many people know what this means. And neither should they. DVD succeeded because it was ‘as simple as CD’. No more rewinding. That made Hollywood billions. Simple is key. The HD formats are not exactly known for simplicity. And the studios are not helping. Neither are the hardware makers. I find my way around these matters. But it’s my job to understand all this. And if it weren’t, then I would rather watch another movie than worry about movies getting downsampled on analog outputs because they would otherwise escape DRM. Having two formats was of course a big problem. But even with Blu-ray remaining, it’s not as easy as it should be. Different disc sizes. Flat panel resolutions. Frame rates.
Image processing. And an interface written in Java simply scares me: there are just too many ways for developers to mess up. Hardware makers and studios alike fall in love with features that have nothing to do with their product. Multi-angle was one of those technical possibilities that DVD had. Studios were all excited about it. Because they didn’t understand what their product is: a movie is one view. One perspective. Everything else is a cute vaudeville attraction, or plain and simple porn that desperately tries to stand out (no pun intended).

DVD hardware sales

Variety on DVD sales numbers
2007 Box office