Wii APIs

December 24, 2007

I have been standing in line for a while now to get a chance to try out the Wii at Makro. Unfortunately, so have quite a few children. I may just have to buy one untried now that I have seen the Wii Opera SDK. The SDK includes:

- Wii Remote detection (Remote Demo, VR Head Tracking Demo): Receive the status of Wii Remote buttons, pointer coordinates, sensor bar distance, and Z-axis roll.
- 3D rotations (Cube Demo, Rippling Water Demo): Rotate polygons in 3D space, then translate them to z-sorted 2D to add that extra dimension to graphics.
- Drawing effects (Ship Demo, Wall Demo, Floor Demo): Draw lines, circles, rectangles, tiles, texture-mapped walls, and more.
- Multiuser communication: Allow multiple players/users to take part in the same piece of software.

Download the SDK: Wii Remote, Graphics, 3D Math (General | FPS)

View the documentation.

Also check out the Wii Remote API, which the Opera folks have released. It allows you to monitor all of the remotes that are connected.

  //Obtaining the roll of the third Wii Remote in degrees
  var remote, roll = 0;
  //check if the browser provides access to the Wii Remote data
  if( window.opera && opera.wiiremote ) {
    //get the KpadStatus object for the third Wii Remote (index 2)
    remote = opera.wiiremote.update(2);
    //check that the remote is enabled
    if( remote.isEnabled ) {
      //get the roll angle in radians
      roll = Math.atan2( remote.dpdRollY, remote.dpdRollX );
      //convert the roll to degrees
      roll = roll * ( 180 / Math.PI );
    }
  }

  //Checking what buttons are pressed on the second remote
  var remote, buttons = {};
  //check if the browser provides access to the Wii Remote data
  if( window.opera && opera.wiiremote ) {
    //get the KpadStatus object for the second Wii Remote (index 1)
    remote = opera.wiiremote.update(1);
    //check that the remote is enabled
    if( remote.isEnabled ) {
      //use the bitwise AND operator to compare against the bitmasks
      buttons.pressedLeft = remote.hold & 1;
      buttons.pressedRight = remote.hold & 2;
      buttons.pressedDown = remote.hold & 4;
      buttons.pressedUp = remote.hold & 8;
      buttons.pressedPlus = remote.hold & 16;
      buttons.pressed2 = remote.hold & 256;
      buttons.pressed1 = remote.hold & 512;
      buttons.pressedB = remote.hold & 1024;
      buttons.pressedA = remote.hold & 2048;
      buttons.pressedMinus = remote.hold & 4096;
      buttons.pressedZ = remote.hold & 8192;
      buttons.pressedC = remote.hold & 16384;
    }
  }
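The `hold` property is a single bitmask, so the per-button checks above can be generalized into a small helper. Here is a minimal sketch: the bitmask values come from the snippet above, but the `BUTTONS` table and `pressedButtons` function are my own illustration (on real hardware you would pass in `opera.wiiremote.update(n).hold`):

```javascript
// Bitmask values for the Wii Remote hold property, as used above.
var BUTTONS = {
  left: 1, right: 2, down: 4, up: 8, plus: 16,
  two: 256, one: 512, b: 1024, a: 2048,
  minus: 4096, z: 8192, c: 16384
};

// Return an array of names for every button set in the hold bitmask.
function pressedButtons(hold) {
  var names = [];
  for (var name in BUTTONS) {
    if (hold & BUTTONS[name]) {
      names.push(name);
    }
  }
  return names;
}

// Example: a hold value of 2049 means left (1) and A (2048) are down.
// pressedButtons(2049) returns ["left", "a"]
```

This keeps the button layout in one place instead of a dozen near-identical assignments.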



Twitter API

September 11, 2007

From Read/Write’s interview with Twitter co-founder Biz Stone

Biz Stone: Yeah. The API has been arguably the most important, or maybe even inarguably, the most important thing we’ve done with Twitter. It has allowed us, first of all, to keep the service very simple and create a simple API so that developers can build on top of our infrastructure and come up with ideas that are way better than our ideas, and build things like Twitterrific, which is just a beautiful elegant way to use Twitter that we wouldn’t have been able to get to, being a very small team.

So, the API, which has easily 10 times more traffic than the website, has been really very important to us. We’ve seen some amazing work built on top of it, from tiny little mobile applications like an SMS timer that just allows you to set a reminder over SMS to call your mom or something like that, to more elaborate visual recreations of Twitter like twittervision.com, which shows an animated map of the world and what everyone’s doing around the world with Twitter. Twitter is popping up from Spain and Japan and the United States.

And that’s very, sort of like, “Look at that!” It’s like staring at a fish bowl or something – an aquarium. You just find yourself getting lost in it. The API has really been a big success for us, and it’s something that we want to continue to focus our efforts on, looking forward.

Wikipedia API?

August 11, 2007

A team of German university researchers is putting together an API for Wikipedia. They describe it as follows: “DBpedia.org is a community effort to extract structured information from Wikipedia and to make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia and to link other datasets on the Web to Wikipedia data.”

What they seem to have actually done is extract data from Wikipedia and make it available online via their API, using SPARQL to query against this data.

From their Introduction:

Wikipedia is by far the largest publicly available encyclopedia on the Web. Wikipedia editions are available in over 100 languages, with the English one accounting for more than 1.6 million articles. Wikipedia has the problem that its search capabilities are limited to full-text search, which only allows very limited access to this valuable knowledge base.

Semantic Web technologies enable expressive queries against structured information on the Web. The Semantic Web has the problem that there is not much RDF data online yet and that up-to-date terms and ontologies are missing for many application domains.

The DBpedia.org project approaches both problems by extracting structured information from Wikipedia and by making this information available on the Semantic Web. DBpedia.org allows you to ask sophisticated queries against Wikipedia and to link other datasets on the Web to DBpedia data.

Wikipedia articles consist mostly of free text, but also contain different types of structured information, such as infobox templates, categorisation information, images, geo-coordinates and links to external Web pages. This structured information can be extracted from Wikipedia and can serve as a basis for enabling sophisticated queries against Wikipedia content.

The DBpedia.org project uses the Resource Description Framework (RDF) as a flexible data model for representing extracted information and for publishing it on the Web. We use the SPARQL query language to query this data.

The DBpedia dataset currently consists of around 91 million RDF triples, which have been extracted from the English, German, French, Spanish, Italian, Portuguese, Polish, Swedish, Dutch, Japanese and Chinese versions of Wikipedia. The DBpedia dataset describes 1,600,000 concepts, including at least 58,000 persons, 70,000 places, 35,000 music albums, and 12,000 films. It contains 557,000 links to images, 1,300,000 links to relevant external web pages, 207,000 Wikipedia categories and 75,000 YAGO categories.
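To make the SPARQL querying concrete, here is a minimal sketch of how a query could be sent to the public DBpedia endpoint over HTTP. The endpoint path and output-format parameter follow DBpedia's published conventions, but the specific query shape and the YAGO class URI are assumptions for illustration only:

```javascript
// The public DBpedia SPARQL endpoint.
var endpoint = "http://dbpedia.org/sparql";

// A simple SPARQL query sketch: fetch up to 10 resources typed as a
// YAGO Person class (class URI is a hypothetical example).
var query =
  "SELECT ?person WHERE { " +
  "?person <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> " +
  "<http://dbpedia.org/class/yago/Person> . " +
  "} LIMIT 10";

// SPARQL protocol: the query travels as a URL-encoded request parameter;
// results come back as XML or JSON depending on the format parameter.
var requestUrl = endpoint + "?query=" + encodeURIComponent(query) +
                 "&format=application%2Fsparql-results%2Bjson";
```

Fetching `requestUrl` with any HTTP client would return a result set of bindings for `?person`, which is the "sophisticated queries against Wikipedia" the project describes.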

Those of you who have been following this blog (and its predecessor, ikissnoise) for a while will know that I have been a big fan of LinkedIn as a social network for business people. They do seem to be lagging behind the rest of the social networking crowd in terms of features, though. A recent announcement that they will be releasing an API within the next nine months has been met with some criticism that it is too little, too late.

So what other strategies are open to someone who has seemingly missed the API bus? Jeremiah Owyang recently posted some interesting thoughts on his blog.

One key feature I see LinkedIn benefiting from is to become the online source of the resume, not just the networks that are connected to the jobs. Help users to answer: “what skills have I learned, who else has them, where can I find others with these skills?” There’s an opportunity to expand the tool as the online resume.

If LinkedIn is to become the premiere social networking tool for businesses (as stated in this article) then they need to consider joining all the communities that exist in the context of business. If I were working at LinkedIn, I would be pushing an API to Facebook quickly, and also a universal login that web managers could integrate into their sites. This identity system could feed into recruiting systems, monster.com and even the ‘careers’ pages on corporate websites: let me fill out my core information (or different versions of it) once and submit it to many. It’s an API really, and would actually be a competitor to some identity management systems, almost like OpenID.

I believe that if LinkedIn doesn’t open an API sooner than nine months from now, they may fall further behind than they think. The hResume move was strategically interesting, though, even if hResume has not yet been widely adopted.

I am possibly moving to Dallas, Texas, in the next couple of months, and I have been looking around at the real estate market on that side. I’ve been really impressed by quite a few APIs, applications and search engines that have sprung up around the real estate market. It seems to be a very exciting area in terms of APIs, visualizations and software; I suspect because the data has always been there and has been captured and analyzed to death, and now it’s time to rework the interface.

Anyway, I came across a very interesting visualization called Hindsight by a real estate search engine called Trulia. It is based on public property assessor records for properties in Trulia’s database, which typically include the date that a house was built. Check out this amazing visualization of the Dallas/Fort Worth area, specifically Plano. It certainly makes searching for a house a whole lot more interesting.

Real Estate information and mapping are ideal partners for a mashup, and besides using Microsoft Virtual Earth for Hindsight, Trulia has a very strong integration with Google Maps for their search engine. All around a very nice application. And they even have an API.

Yesterday eBay made a series of announcements regarding new APIs and developer tools, with the company planning to rebuild the technical guts of its eBay.com site as a series of modular services rather than a single, unified application. Today, David Berlind chipped in with an interesting analysis of web-based APIs from Google, Amazon, Yahoo, Microsoft, AOL and eBay becoming the new platform providers, in the way that in the desktop era it was the operating systems of Windows, Mac and Unix that provided the primary platforms for applications.

In a sense, what he is saying is that if we compare the API providers to Windows or Mac, then the next step in application development is the user interfaces being built on top of those APIs. Microsoftware.

the Holy Grail for companies like Salesforce, eBay, Amazon, Yahoo, Google, et alia: to be in the infrastructure business but to let developers be the ones that drive adoption through innovation. Sure, if you’re one of those or other API providers, it helps to provide prototypes or something that’s minimally functional to get new users started. But when I look at where Google is going with Google Apps (of which Google Docs and Google Spreadsheets are only a part), my sense is that there are innovators out there that will come along and build user interfaces on top of Google’s APIs that are far more compelling than Google’s native interfaces.

So where does that leave the nonprofit looking to leverage current and forthcoming trends? For a second, stop thinking of the organisation as a single unified organisation based on a single platform. Start thinking of it as a technological enabler for a cause. The tools to enable that cause are developed, and then the constituents are given these tools to develop their own versions on top of them, in the way that Excel allows you to develop your own spreadsheets, rather than in the way that eBay releases a new API. If your constituents are savvy enough, you can always give them an API as well.

By the way, if you read through to the end of David Berlind’s article, one cannot help but think that he is arguing that the business to be in is the business of creating a better UI. I think Joel may agree.


Microsoftware

June 2, 2007

Microsoftware, nothing to do with Microsoft, refers to small, discrete bits of software that rely on an established piece of software or an API. A classic example of this is Joel Spolsky’s Copilot project: a small, discrete piece of software that uses the open source VNC project as its core. A piece of software that uses an established API can also be seen as microsoftware, like Agent Earth, which allows users to browse real estate data by location in Google Earth. Even TubeMogul, which allows users to track online video analytics across video sites including Google Video, Metacafe, MySpace, Revver and YouTube, can be seen as microsoftware.

Developing microsoftware ensures that you have a ready audience for your product. It also often means that the heavy lifting is already done by the parent product, and that your microsoftware adds value via a different interface, a mashup of two or more data sources, simplification of an existing process, a new process, or new data based on the parent product. Plugins, software dependent on APIs, toolbars, widgets, and software that gives the parent product a new UI can all be seen as microsoftware.