New Zealand leading the way in draconian copyright infringement law

Posted on Oct 18, 2008

We New Zealanders pride ourselves on being innovative and ahead of the world. We are the first to see the light of a new day, we were the first to give women the vote, and now we have the dubious honour of being the first country in the world to pass into law a three-strikes-and-you're-out approach to online copyright enforcement.

Essentially the bill states that ISPs are now responsible for "reasonably implementing" a policy to disconnect users' internet services in "appropriate circumstances". As you can see, the wording of the act is vague to the point that ISPs are unsure of how they are even supposed to enforce the new law. According to Telecommunications Carriers Forum chief executive Ralph Chivers, "Section 92A has achieved one thing, and one thing only, uniting the ICT sector and others who will be affected in an unprecedented show of solidarity against it". He went on to say "The Act gives no guidance on what 'reasonably implement' or 'in appropriate circumstances' mean. This leaves the door wide open to those who seek disconnection of an alleged repeat infringer based on flimsy evidence, or worse, allegations alone."

So… ISPs don't want it, and consumers obviously don't want it. As usual, the only organization driving the adoption of these sorts of laws is the recording industry MAFIAA, in this case the New Zealand equivalent, the RIANZ. In a quote that pretty much sums up his contempt for the very people who keep his industry in business, RIANZ CEO Campbell Smith said that it would be "impractical and ridiculous" for copyright owners to prove the guilt of infringers in court before demanding they be cut off from the internet.

Whatever happened to the idea of being innocent until PROVEN guilty? It seems that once again, rather than reinvent their business and distribution models in a way that benefits their consumers, the MAFIAA would rather play the role of McCarthy, accusing us all of being communist pirates based on frivolous and spurious allegations until we fall into line.

With elections coming up on November 8th, make repealing or reworking this act an issue and ensure that we don’t lead the world when it comes to eroding our rights to appease industry lobby groups.

…Or failing that, send a few hasty accusations of copyright infringement the RIANZ's way and see if they like being disconnected without proof :)

Hearing voices

Posted on Aug 15, 2008 dotnet programming

A quick post about a fun little app I wrote a few days ago (with the potential to enable all sorts of awesome pranks on the unsuspecting). It allows you to remotely send text to the host computer, which will then read it out on the speakers using Microsoft text-to-speech. It's essentially just a Windows service which runs an HTTP server; to send speech you just need to make an HTTP GET request to the server URL and pass the text as a query string, e.g.

http://localhost:8080?text=hello world

The HTTP server also accepts HTTP POSTs, in which case it assumes that the body of the POST contains the text to speak. This tool is not only useful for pranks; it's easy to integrate into notification systems, e.g. broken-build or code check-in notifiers.
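For the curious, here's a minimal sketch of how the core loop of such a service might look. It's not the actual source (that's linked below); it assumes a reference to System.Speech and simply ties an HttpListener to the speech synthesizer:

using System.IO;
using System.Net;
using System.Speech.Synthesis; // requires a reference to System.Speech

class SpeechServer
{
    static void Main()
    {
        using (HttpListener listener = new HttpListener())
        using (SpeechSynthesizer synth = new SpeechSynthesizer())
        {
            listener.Prefixes.Add("http://localhost:8080/");
            listener.Start();
            while (true)
            {
                HttpListenerContext context = listener.GetContext();
                string text;
                if (context.Request.HttpMethod == "POST")
                {
                    // POST: the request body is the text to speak
                    using (StreamReader reader = new StreamReader(context.Request.InputStream))
                    {
                        text = reader.ReadToEnd();
                    }
                }
                else
                {
                    // GET: the text arrives as the "text" query string parameter
                    text = context.Request.QueryString["text"];
                }
                if (!string.IsNullOrEmpty(text))
                {
                    synth.Speak(text); // blocks until the speech finishes
                }
                context.Response.Close();
            }
        }
    }
}

The real app wraps this in a Windows service rather than a console Main, but the request handling is the interesting part.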

You can find the source for this application here

Silverlight, one step forward, one step sideways

Posted on Jul 11, 2008 silverlight programming

I finally got around to updating my Silverlight Amazon.com search engine Tarantula to work with Silverlight 2 beta 2 (see it in action here). This update included some relatively minor (as opposed to the wholesale changes from 1.1 alpha to 2.0 beta 1) though essential enhancements. Most of the changes have been in the area of control templates, which were a bit lacking in the first beta release. The introduction of the visual state manager is great and allows for smooth transitions from one control state to another, so I could re-add the nice button highlight effects that were present in the original alpha version of Tarantula but which I had to remove in the first beta.

However, the reason I took so long to update Tarantula to work with beta 2 is that Microsoft in their wisdom decided to change the format of cross-domain policy files that Silverlight applications will accept. To cut a long story short, this means that the policy files published by the Amazon.com web services and many other web service providers are now incompatible with Silverlight beta 2 clients, rendering those services inaccessible.

To remedy this I had to write a proxy for the Amazon.com services, hosted on an accessible domain. However, I couldn't be bothered writing a wrapper for the Amazon.com web services (like I did for the alpha version of Silverlight), so I wrote a general-purpose SOAP proxy component to do the work for me.

It works as an ASP.NET HttpHandler and maps local proxy endpoints to their real locations elsewhere on the net. To get it working, all you need to do is have an ASP.NET website that will operate as the proxy, add a few config elements to the web.config, and then point the Silverlight client at the proxy address; it will all just work as if you were communicating directly with the services hosted on the otherwise inaccessible domain.
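The forwarding core of such a handler is simple enough. Here's a hedged sketch of the idea (the class name and hard-coded endpoint are illustrative, not the actual component source): it relays the incoming SOAP envelope to the mapped remote endpoint and streams the response back.

using System.IO;
using System.Net;
using System.Web;

public class SoapProxyHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // In the real component this comes from the endPointMappings config;
        // hard-coded here for illustration.
        string remoteEndPoint =
            "http://soap.amazon.com/onca/soap?Service=AWSECommerceService";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(remoteEndPoint);
        request.Method = "POST";
        request.ContentType = context.Request.ContentType;
        string soapAction = context.Request.Headers["SOAPAction"];
        if (soapAction != null)
        {
            request.Headers["SOAPAction"] = soapAction;
        }

        // Relay the incoming SOAP envelope to the real service
        using (Stream requestStream = request.GetRequestStream())
        {
            Copy(context.Request.InputStream, requestStream);
        }

        // Stream the service's response back to the Silverlight client
        using (WebResponse response = request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        {
            context.Response.ContentType = response.ContentType;
            Copy(responseStream, context.Response.OutputStream);
        }
    }

    // Manual stream copy so this runs on .NET 2.0 (no Stream.CopyTo)
    private static void Copy(Stream input, Stream output)
    {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, read);
        }
    }
}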

In addition to the usual web.config changes to add an HttpHandler (detailed in the instructions bundled with the SoapProxy component's download), I had to add the following config entry to the tarantula.sharpoblunto.com site to map the proxy endpoint to the real Amazon service endpoint:

<soapProxyComponent>
  <endPointMappings>
    <mapping proxyEndPoint="amazon.ashx"
             remoteEndPoint="http://soap.amazon.com/onca/soap?Service=AWSECommerceService" />
  </endPointMappings>
</soapProxyComponent>

Then in Tarantula I had to change the service endpoint in the ServiceReferences.ClientConfig to point to the proxy address instead of the real endpoint:

<endpoint address="http://www.sharpoblunto.com/amazon.ashx"
          binding="basicHttpBinding"
          bindingConfiguration="AWSECommerceServiceBinding"
          contract="Tarantula.AmazonWebService.AWSECommerceServicePortType"
          name="AWSECommerceServicePort" />

In practice it's worked perfectly. I hope this tip comes in handy for those trying to access third-party web services from Silverlight beta 2 apps.

Regarding the reconstruction of a site with additional dynamism

Posted on Jun 20, 2008 dotnet programming

Of late I've done a redesign of the home page for one of my side projects, Junkship. While most of the site was simply static content that I migrated from PHP to ASP.NET on a new hosting environment, I did add a few new spiffy features around the site's image gallery. Because I'm not a fan of the bloated ASP.NET AJAX framework with its update panels and such, I went for a lightweight approach to adding Ajax functionality (I would have liked to redo the whole site using the excellent ASP.NET MVC framework, but alas my webhost only supports ASP.NET 2.0 sites).


All Ajax postbacks are done using the Prototype library, which provides a simple and clean means of making Ajax requests to the server; the server renders the desired controls and sends back the HTML, which Prototype can then inject into the page DOM.

Returning rendered HTML from Ajax postbacks has a few nice advantages over returning raw XML data that has to be parsed and rendered on the client through JavaScript processing. Firstly, it reduces the JavaScript work that has to be done in order to update the page, as all the client has to do is inject the HTML subtree into the DOM rather than build its own by picking bits out of the response XML. The other nice advantage is that the control rendering only needs to be written once, on the server side, not duplicated on the JavaScript side.
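Server-side, the trick is to render a user control to a string and write it out as the Ajax response. Here's a hedged sketch of that idea; GalleryHandler and ImageGallery.ascx are hypothetical names for illustration, not the actual Junkship source:

using System.IO;
using System.Web;
using System.Web.UI;

public class GalleryHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // Host the user control on a throwaway page so it gets a proper
        // lifecycle, then execute the page into a string writer.
        Page page = new Page();
        Control gallery = page.LoadControl("~/Controls/ImageGallery.ascx");
        page.Controls.Add(gallery);

        StringWriter html = new StringWriter();
        context.Server.Execute(page, html, false);

        // The client-side script injects this markup straight into the DOM
        context.Response.ContentType = "text/html";
        context.Response.Write(html.ToString());
    }
}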

An issue that comes up when designing Ajax-enabled pages is the browser's back button, namely the breaking thereof. For my image gallery I wanted the back button to take you back a step, but not necessarily to the previous page, as you can browse between many images on the same page via Ajax calls. To make this possible I tried to write my own solution by injecting anchor tags into the page via JavaScript, which worked… on Firefox and not much else :).

I decided that surely someone else must have already written such a framework, done a better job, and made it cross-browser, and it turns out I was right. It's called the RSH framework (Really Simple History) and it allows you to specify when you would like to create a snapshot of the page to return to when the user uses the back or forward buttons. You can add as many of these points as you want, and there is a callback system which notifies your JavaScript of back/forward events and allows you to restore the desired page state based on the snapshot data. It works really well in practice (though I couldn't get it to work on Safari, which isn't great because I use Safari a lot now on my iPod Touch) and means that the page behaves as you would expect.

You can view the gallery (and my awesome artwork :)) here

Silverlight redux

Posted on Apr 16, 2008 silverlight programming

After my earlier experiments with the alpha releases of Silverlight 1.1, I was extremely keen to give the newly released Silverlight 2.0 Beta 1 a run for its money and port my Tarantula Silverlight application from Silverlight 1.1 to Silverlight 2.0.


Now this was easier said than done, not because Silverlight 2.0 is difficult to work with (compared to the alphas it's a breeze) but because pretty much everything has changed.

The addition of user controls (no need to create custom text boxes anymore!) has meant that the XAML markup is a lot closer to what you'd find in WPF, with Grid and StackPanel layouts, control styles, and control templates. This means that you can now separate style from structure in the XAML much more effectively and apply consistent styling to user controls. Unfortunately, control templates expose only a limited set of events; mouse exit, for example, is not exposed for button templates, meaning that fade-out effects on buttons are not possible without custom code/markup.

Whereas before, Silverlight apps were deployed as a collection of DLLs and XAML files, they are now deployed as a .xap package. This is just a zip file containing the DLLs, but it's a lot tidier and shrinks the download size for Silverlight apps.

Web service support has drastically improved since the alpha releases. In 1.1, cross-domain HTTP requests were not allowed, meaning that in order to create a Silverlight mashup utilizing external web services you had to actually implement a server-side web service proxy to the external services. This was a huge hassle and has thankfully been done away with in 2.0, as you can now make cross-domain requests provided that your domain is permitted by the target's cross-domain policy file. Another annoyance in 1.1 was that only JSON web services could be consumed; this has changed, and it is now possible to consume WCF and standard SOAP web services.
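For reference, a service opts in to Silverlight cross-domain access by publishing a policy file at the root of its domain. A maximally permissive clientaccesspolicy.xml looks something like this (check the current Silverlight documentation for the exact schema):

<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>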

image[14]

The result of all these changes was that I had to pretty much rewrite all my XAML and the back-end web service code. However, thanks to the wonders of the Model View Presenter (MVP) pattern, in particular the Passive View variation of MVP, I didn't need to change any of my application logic. The other great thing about the Passive View pattern is that because the view can easily be implemented as a mock object (using inversion of control and dependency injection), the presenters can be made completely unit testable (though I was too lazy to implement any tests :)).
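To illustrate the shape of the pattern (with hypothetical names, not Tarantula's actual types): the view is a dumb interface, the presenter holds all the logic, and dependencies are injected so both can be swapped for mocks in tests.

using System.Collections.Generic;

// The passive view: no logic, just properties and display methods
public interface ISearchView
{
    string Query { get; }
    void ShowResults(IList<string> titles);
}

// The service dependency, also behind an interface so it can be mocked
public interface IAmazonSearchService
{
    IList<string> FindTitles(string query);
}

public class SearchPresenter
{
    private readonly ISearchView _view;
    private readonly IAmazonSearchService _service;

    // Dependencies are injected rather than constructed internally
    public SearchPresenter(ISearchView view, IAmazonSearchService service)
    {
        _view = view;
        _service = service;
    }

    public void Search()
    {
        // All application logic lives here; the view just displays the result
        IList<string> titles = _service.FindTitles(_view.Query);
        _view.ShowResults(titles);
    }
}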

The final result is that my application is functionally identical to the original, though it was much easier to get it all up and running this time than it was the first time. Silverlight is progressing nicely and I am looking forward to the next release. You can find my application here and the source code is available for download here
