Ada

This post is written for the Ada Lovelace Day pledge.

I’ve been fortunate to work with a number of brilliant women in technology. Mostly those I met at the BBC, including Paula Le Dieu, Alice Taylor, Anne Fairbrother, Priya Prakash and Anno Mitchell, and those I only met fleetingly but whose reputation and work were well known, like Fiona Romeo and Kim Plowright. (The BBC had a clutch of women who were outstanding, and perhaps this rubbed off on me as I nominated a meeting room at the BBC to be called Ada Lovelace. It was.) Beyond my immediate working environment I’ve been influenced by writers like Genevieve Bell and Kathy Sierra (tho better in print than speaking, methinks) and of course danah boyd. I would choose danah for this Ada day but I imagine she’s incredibly popular already and doesn’t need another blog post. Someone less well known, who’s been at least as significant an influence as danah, is Susan Leigh Star, and she doesn’t get mentioned much, so here you go. I’m a fan of Actor Network Theory, and Susan Leigh Star was someone who took that methodological approach to look at technology in all its forms. She deconstructed it and rebuilt it again in some marvellous ways.

Her book “Sorting Things Out” is still an inspiration in looking at how people classify things and the daily politics involved in that classification process. It’s not Technology with a capital T, and I think that’s what I learnt most from Susan’s work. Technology doesn’t exist as such, it’s just non-human stuff, and calling it that doesn’t denigrate it in any way; rather it serves to empower it by showing that materials have the potential for effects rather than being ‘objects’ subservient to humans with ‘agency’. I wouldn’t usually recommend an academic, because on the whole I think they’re not that useful, but Susan’s work has stood out for me as being practical and insightful, and despite its subject matter not always being that sexy she manages to make her work interesting.

That’s it.  Well done Susan, there’s a trophy waiting for you in Sheffield. ;)

There are a heap of video playout services now. What I quite like about Hellodeo.com is that it takes the idea of presence further than its peers by hooking into your AV straight away and recording. It’s an [all too] simple process. And judging from the featured material on the site it produces quite unreflexive, more immediate encounters with fellow global citizens. This is a mirror. <Hello foreign man singing>. And it’s all so refreshingly basic, which won’t help when it comes to creating data that’s social, but it makes for a far more pleasant visual experience. Is anyone else tired of web 2.0 publishing ‘features’? When do these ‘features’ become pretty lame marketing gizmos? I mean I understand the appeal of embedding metadata in the web around an object, but I also understand the egotistical affordance of button-mania, like badges on the laptop of a die-hard loser. Like the Groucho Marx quote, being a member of a club that would have me [and the rest of the world] as a member isn’t that enticing.

In the past I’ve never had a say in what was the most appropriate script to use to build something. This is mainly because other people have been far better qualified than I to make the decision, but also because it’s never really been in my interest to know. So long as it worked and adhered to the ‘experience’, the features and the functionality we specced, then, like, whatever. Recently though I’ve been developing an idea to form the basis of a service and it has been in my interest to know [because I'm carrying the risk], not least because it’s near impossible to get an ‘agnostic’ view of the most appropriate language in which to develop. Key factors for me [sketched very roughly in the snippet after this list]:

  • development time
  • scalability
  • transferability [i.e. is this stable? how easy is it to maintain? is the language well known or is it like Latin and dying out fast? or, even worse, like some street vernacular that isn't even documented properly and is unknown to all but a handful of very well paid scruffs ;) ]
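
Not a tech scope, but to make the trade-off concrete, here’s a toy sketch of how one might score candidate languages against those three criteria. The weights and scores are entirely made up for illustration, not a benchmark:

```python
# A toy, entirely hypothetical scoring of candidate languages against the
# three criteria above. Weights and scores are invented for illustration;
# the point is the shape of the trade-off, not the numbers.

CRITERIA_WEIGHTS = {
    "dev_time": 0.40,         # how quickly can a small team ship something working?
    "scalability": 0.35,      # how far does it stretch before re-architecting?
    "transferability": 0.25,  # maturity, documentation, ease of maintenance/hiring
}

# Scores out of 10 -- made up, not measured.
CANDIDATES = {
    "PHP":    {"dev_time": 8, "scalability": 6, "transferability": 8},
    "Python": {"dev_time": 8, "scalability": 7, "transferability": 7},
    "Ruby":   {"dev_time": 9, "scalability": 5, "transferability": 6},
    "Java":   {"dev_time": 5, "scalability": 9, "transferability": 9},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores into a single weighted figure."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

for name, scores in sorted(CANDIDATES.items(),
                           key=lambda kv: weighted_score(kv[1]),
                           reverse=True):
    print(f"{name:<7} {weighted_score(scores):.2f}")
```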

Yet trying to get a straightforward answer around such criteria, without actually doing a full-blown tech scope and incurring the cost of that, was and is tough. Developers and technical project managers are steeped in what they know, and that can become obsolete or partial very quickly as languages adapt and change. It’s a big issue. I know the BBC had their architecture pinned down early and were heavily dependent on Perl, all for good reasons, but now find it hard to adapt to the web 2.0 functionality increasingly expected by the younger audiences they covet.

Hence I’ve had my head in lots of primers over the last four months, trying to get up to speed with languages like PHP, Python, Ruby etc. It’s helped, but it’s not the best use of my time, and there has to be a market for people like me looking for a good overview of the foundations of web development, both client and server side. Maybe there already is, but I don’t see it. This is one of the first ‘lay’ views of the languages of web 2.0 and their ‘facets’:

[Image: Brayintrinsics_copy, a chart of web 2.0 languages and their ‘facets’]

Dev time seems to be a straightforward trade-off with scalability and transferability, which makes languages like Ruby great for start-ups looking at proof of concept and limited scale, presuming that once they’ve proven the business model they can invest in re-developing the application or service, underpinning the house and putting the scaffolding up. I do find it kind of ironic, though, that in the search for ‘purity’ amongst developers [and they all talk about how 'pure', 'clean' etc. their code is, and the use of such metaphors is in itself really interesting, cf. Mary Douglas' work] the architecture of the web is actually pretty messy, despite or perhaps because of the search for such ‘purity’. And given that much of our future could depend on that architecture, I’m surprised there’s so little discussion of it beyond enterprise level, as bodies like the W3C concern themselves more with accessibility standards than building regs.

Link: PHP vs Java vs Ruby.


Skimming Block, originally uploaded by superlocal.

I love the Far East.  I love the way they just get on and make stuff and then make more stuff to solve the problems of the initial stuff.  There’s no prevaricating, they just keep making instead of legislating, which is what we’re so good at in Europe.  Here we have a skimming block for RFID [which is embedded in credit cards over there and probably soon to be here].   
Makes me think of Matt Webb’s recent post on the excellent Pulse Laser blog about the rise of distributed manufacture in Japan and the use of interchangeable parts to make a complete product, rather than it being manufactured by one organisation. This skimming block works the same way but for the situations or experiences of the individual person: rather than solving the problem for everyone in the ‘host’ technology [so working for all RFID chip users], you have interchangeable solutions for the range of technologies employed by the user.
In some scenario planning work I did whilst at the BBC, one of the stories had the rise of technical ‘plumbers’ to solve the problems you had with interoperability, or rebellious technology: workarounds necessary for the myriad different socio-technical relations that emerged in the digital age. That service, and the sorts of products like the one above, seem increasingly plausible, in contrast to the discourse of the future which had all our ‘technology’ as pure, whole and interoperable. But is the UK economy set up for that, or are those products and services going to be imported or offshored?

Early, engaged novice twitter behaviour and networking is giving way to background, lower-level use. I’ve particularly noticed scale limits: 25 people is my ceiling. Phone notifications are now off, after a quiet walk in the woods was disturbed by a busy, aviary-like twittering from my pocket. I had 42 new texts in the space of 30 minutes. That’s when continuous partial attention becomes proper attention and bugs me. Still, I’m intrigued by the comms structure across different mediums and the affordances of those mediums – AIM can ping away at will but the phone is not good at background media [though obviously that is a huge generalisation and I'm ignoring context etc. and should be whipped]. Twitter is one of the few instances where you can see how the narrative, fragmented and distributed as it is, plays out differently across space and time and medium. And in that sense actually ‘doing’ social media like twitter is far better than reading about it [and that tends to be my recommendation to clients, which kind of puts me out of further work].

That said, I’m going back to Dourish and ‘Where the Action Is’ and enjoying him far more this time around. I hadn’t given him credit for cutting through vast swathes of impenetrable social theory whilst remaining lucid and relevant and only occasionally disappearing up his derrière. I’m particularly enjoying reading the book with reference to Wittgenstein’s "Language Games": the construction of meaning through shared ‘interaction’, through ‘doing’, rather than a pure expression of some inner mental state. Of course when you take up a new social medium like twitter, the ‘rules’ take a while to play out, there’s a lot of negotiation, and in many ways that’s the most fun ‘bit’. The imperfection makes it perfect, at least for a while.

I find it quite staggering how much dotcom M&A activity is going on right now. IPOs seem to be coming through daily in a race between the big four [Google, Yahoo!, News Corp and Microsoft] not to lose out. All seem to be spending ahead of revenue growth, in activity not seen since the boom/bust days of 1997-2001.

Of the biggies, Google seem to me to be the main player as they continue to milk overseas markets and post profit increases – up 11% this year, mostly due to a strong second half – and this despite Yahoo!’s 37% drop in profits posted last quarter due to a decline in corporate advertising online:

Google-owned Web sites generated revenue of $1.63 billion, or an 84 percent increase from $885 million in the third quarter of 2005. Google earned 60 percent of its total revenue from its own sites, versus 56 percent a year earlier.

Revenue from affiliated Web sites using Google’s AdSense programs accounted for 39 percent of revenue, or $1.04 billion, a 54 percent rise over year-earlier revenue of $675 million.

Google’s growth rate is two to three times faster than its Internet peers: eBay Inc. (Nasdaq:EBAY) at 31 percent, Yahoo Inc. (YHOO.O) at 19 percent and major software rival Microsoft Corp. (Nasdaq:MSFT), which has revenues growing at about 10 percent.

And this disparity is down to the strength of ppc ads rather than discretionary banner ads, which Yahoo! relies on serving up to its rising number of users. As ad spend drops, discretionary spend is hit hardest. However, Yahoo! should come back next year according to The Examiner:

Wall Street has been frustrated with Yahoo for most of this year, largely because the company hasn’t been targeting online ads as effectively as Google Inc., the Internet search leader that runs the Web’s largest marketing network.

Yahoo’s improved ad platform, code-named "Panama," is designed to close the gap with Google.

But it’s not just that Google’s revenue growth and search platform are so strong; they also genuinely seem to have a strategic roadmap that the others lack [though I suspect Murdoch has a vision]. The brand mantra to "organize the world’s information and make it universally accessible and useful" is backed up by their 10 things they know to be true, of which "do one thing really well" is spot-on. They do search, whereas the others have been pulled and swayed in other directions. And search pays. But they’re also building up an enviable swathe of GTD browser-based products [buying up Writely etc. etc.], providing chat, docs, spreadsheets and calendar to organise your life around them; they’re becoming indispensable. No one else offers such interoperable life support. So Google on the one hand allow advertisers to do ppc well, and on the other they’re building lifetime, utility-based value amongst users from which to serve up ppc. Yahoo!’s many excellent services [del.icio.us and flickr] don’t hang together in the same way; they remain disparate, stand-alone utilities.

It’ll be interesting to see how widgets [what Microsoft calls gadgets] take off next year with their support in Microsoft Vista etc. Google isn’t really set up for widgets in the way that Yahoo! is with its purchase of Konfabulator, now Yahoo! Widgets. The melding of web services and client app possibilities in widgets for the development of new services is massive… and perhaps this is where Yahoo! has a chance to bite back?

The ‘life beyond the broadcast’ for BBC Weekenders seems quite healthy.
Moylesy, taken from kc_mcfen’s flickr stream. See the pool, which with 222 members and over 2,300 pics as of today is up there with the Japanology pool! Woohoo. See also the weekender tags and a rather wonderful set from ruu, as well as Radio 1’s own polished yet somehow dull set [perhaps because it feels more 'produced' amidst the more edgy images from the audience].

The distributed media malarkey allows us to experience and engage with the event in so many different ways, as those experiences ‘folded’ back into the event and then lived a life beyond it. Of course that brings some teething problems, not least distributing resources to the edges where the communities have distributed to! Following conversations to manage any potential problems is one of the key issues with an ‘open’ and inclusive media strategy like the one the BBC employed here. This open approach was exposed somewhat by the use of flickr discussion groups to get answers when the postie scam hit. It would seem that flickr is the pre-eminent platform for extending mainstream social experiences online. Perhaps Flickr’s move from ‘beta’ to ‘gamma’ [lol] is recognition of this fact, that and their re-design.

[Image: Flickr_gamma_1]

It’s not just flickr of course. The big news from the Weekend was simulcasting the event in Second Life. Despite the PR, or actually perhaps because of the PR, from the Second Life experiment, I’m left a little disappointed by the fact that it seemed so, well, er, dull. Am I allowed to say that? The seeming adulation afforded MMOGs is almost cult-like, and as an outsider looking in it can’t match the hype. Innovative, sure, in the sense that it utilises a different platform to showcase the event. But where’s the playout from the ‘event’ in game? The ‘ripples’ don’t seem to have a different ‘life’. And where’s the social innovation in game? Perhaps I’ve missed something. Oh. No. There it is, there’s the Daleks! :-)


BBC Radio 1 Big Weekend VirtualFestival, uploaded by Louise of RiversRunRed, the makers of the BBC Second Life event.

Perhaps I’m expecting too much, but it would have been great if there had been some offline playout of the game at the event [there was a screen apparently]. This could have simply been in the form of a conversation developing between those at the offline event and those in game, so that the ‘tension’ between the experiences was exposed, introducing a reason for dialogue. It wouldn’t be easy, but a rather crude model like the subservient chicken shows how calls to action from both communities could be initiated. The big screen relaying the Second Life event exposes messages to the people watching at the offline event: "dance like a dalek" etc. etc. There’s a level of sophistication in the creation of the experience that I clearly haven’t had time to work through :-) but you get my gist. Shortcodes to txt back into the game and send images of "dancing like a dalek" could all work. It’s just one very simplified thought. More interesting ideas start to come through in gaming an event itself… cues set in game that people have to solve in the physical environment of the event to determine which acts perform, when, what etc… or making a mainstream visceral experience akin to that shown by Blast Theory in Uncle Roy All Around You early last year. Experience design on that level gets really, really exciting.


spies, originally uploaded by JamesB.

The BBC has opened up its catalogue going back to 1937. Well done, MattB and Tom. It’s a wonderful thing.

I worked in TV as a researcher/AP for the BBC between 1999 and 2002, first on Timewatch [for a lovely bloke called Tilman Remme] and then in Current Affairs [and much later a short stint at Newsnight]. It’s great to be able to get the info. Though I don’t remember getting contributions from Hitler, Kim Philby and Anthony Blunt for this film – my favourite ;-) [methinks the 'contributors' descriptor is a little generous as it covers production staff, historical figures and interviewees]. I had 3 glorious months at the National Archives in Kew researching this one, my first job for the beeb, uncovering the many great stories that came out of WWII. The film couldn’t do justice to the richness of what we found. Of course there are bound to be issues with the cataloguing – not least of which is that I don’t get a credit here, grr… Still, you’ve got to love those librarians. Working diligently, without any of the glamour that most people see as a perk of working at the beeb. They’re kinda machinic in the way they construct and adhere to strict data protocols. I suppose in that sense they’re very high-level code really. Respect!

So the BBC is plugging into the world. It’s all been said before, but how great it is to be able to pull out and aggregate content by contributor, date, series, episode, channel etc. and work that into bottom-up data from other sources like wikipedia entries for shows, presenters and events.

I see a useful application in this as a social documenting tool: being able to visualise the key ‘memes’ broadcast by the BBC by year and see how they correlate with wider events. Is the BBC a useful barometer for the zeitgeist? Does it lead or follow? Could be quite fascinating…
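
For illustration only, here’s a rough sketch of that barometer idea: counting how often a keyword crops up in programme titles and synopses per year. The file name and column layout [year, title, synopsis] are hypothetical; the real catalogue data would need mapping into something like this first.

```python
# A rough sketch of the 'barometer' idea: count how often a keyword appears
# in programme titles/synopses per year. The file name and columns
# (year, title, synopsis) are hypothetical -- real catalogue data would need
# mapping into a shape like this first.

import csv
from collections import Counter

def keyword_by_year(catalogue_csv: str, keyword: str) -> Counter:
    """Return {year: number of programmes mentioning keyword}."""
    counts = Counter()
    keyword = keyword.lower()
    with open(catalogue_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expects columns: year, title, synopsis
            text = f"{row['title']} {row['synopsis']}".lower()
            if keyword in text:
                counts[row["year"]] += 1
    return counts

# e.g. how often 'spies' crop up, year by year, to lay alongside wider events
for year, n in sorted(keyword_by_year("catalogue_export.csv", "spies").items()):
    print(year, n)
```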

[thinks - should have worked at a red top.]

Spent a great day at the BBC Labs in Masham, Yorkshire, yesterday, courtesy of Matt, working with the various teams and mentors who were finalising their propositions. It was one of the best days I’ve had in a long time, being around a lot of creative technologists under pressure to produce something of value. But more than this hot-housing environment, it was really refreshing to be around people who just wanted to make things; sure, they were talking about some of the latest memes and mash-up stuff, but they had their feet firmly on the ground, producing something rather than jumping on an idea because of its faddish currency. Doing stuff is so satisfying.

As a sausage machine for innovation the BBC lab is a particularly nice one: pleasant surroundings, 5* accommodation and high ceilings. Not that you need plush surroundings to make beautiful things, but it must help. We’ll see.

The lab reminded me of my first year at the BBC in 2000, when Frank Boyd [Artec, AudioRom], who ran this lab, ran the first ever BBC Interactive Lab. I was invited onto it by BBC Current Affairs, where we produced an interactive TV programme [what would now be termed enhanced TV] called "Interactive Pope" [it sucked]. But the lab was fantastic – a coming together of a great many different skills, abilities and attitudes, and booze. This was a similar, if far more sober, affair, reflecting our more New Labour, puritanical times and perhaps a more ‘serious’ BBC.

I think there’s a model here, in these labs that AudioRom were pushing in the late 1990s, that is so now. There are more technologists able to make things quickly and creatively, and the landscape on which to build is more stable. I wouldn’t be surprised if we see more institutions/collectives outside of those zany folk in San Francisco ‘doing’ labs in the next year or two in the UK. It’s a great, cost-effective way to reach out and innovate.

I was somewhat disappointed last year at Etech and wasn’t able to justify to myself the cost in time or money this year. However, it would seem I’ve missed something potentially massive, but massive in an unassuming, quiet-revolution sort of way: the "Live Clipboard" [via Ben].

Microsoft’s dev team [at the instigation of Ray Ozzie] have worked on the basic principles of web publishing and higher-level ‘scripting’ mash-ups using structured data, to think through how you could create a schema for the common user to create their own ‘mash-up’. It’s a very simple idea, based around the ‘clipboard’, arguably the most useful yet overlooked function of windows-based environments. By creating a standard data format for copying data and pasting it between different browsers and applications, you have the potential to make things anew. It’s about applying the concept of the clipboard to the web: things like calendar events, lists, shopping items etc. Of course some of this data is already available in a transferable format as XML, but much isn’t.
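
To make the concept concrete, here’s a much-simplified sketch of the general idea, not the actual Live Clipboard format: ‘copy’ serialises a calendar event into an agreed structure and ‘paste’ reads it back as data rather than flat text. The field names, and the use of JSON rather than XML, are my own illustrative choices.

```python
# A much-simplified sketch of the clipboard-for-the-web idea, NOT the actual
# Live Clipboard schema: on 'copy' the application serialises the thing itself
# (here a calendar event) into an agreed structure; on 'paste' the receiving
# application gets it back as data rather than a lump of prose. JSON and the
# field names are illustrative choices only.

import json

def copy_event(summary: str, start: str, end: str, location: str) -> str:
    """Serialise an event into a shared, structured clipboard payload."""
    return json.dumps({
        "type": "calendar-event",  # agreed vocabulary both ends understand
        "summary": summary,
        "start": start,            # ISO 8601 dates keep parsing unambiguous
        "end": end,
        "location": location,
    })

def paste_event(payload: str) -> dict:
    """Parse the structured payload back into usable data."""
    data = json.loads(payload)
    if data.get("type") != "calendar-event":
        raise ValueError("clipboard payload is not a calendar event")
    return data

clipboard = copy_event("Radio 1 Big Weekend", "2006-05-13", "2006-05-14", "Dundee")
print(paste_event(clipboard)["summary"])
```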

I like this development not because it’s particularly technically radical [it isn’t; the innovation is in the concept and the way it’s deployed], but because it takes a lot of what have up until now been standards and platforms for developers to develop with, and applies those same principles to the end user, you, me, my mum, in a way that is enormously beneficial and yet quite humble, quite ‘ordinary’. As Ray states:

Clearly as the flexibility and potential of “mashing up” and recombining application components gets closer to someone who understands the user’s needs, the value to that user increases.

Damn right. And you can’t get much closer to the user than the user themselves.

They’ve released the code under an attribution share-alike licence, which of course they’d have to in order to get people to actually use and develop with it and make it a standard, to give it currency. Nice one Microsoft – standing up for the ordinary folk and innovating a move away from proprietary apps [and it goes some way toward forgiving Ray for giving us Lotus Notes].

An example of Live Clipboard to play with

Walk-throughs of Live Clipboard