Archive for January, 2012

One corner of manufacturing industry

January 19, 2012

So much has been written over the decades about the British economy’s move away from manufacturing that it’s easy to overlook the extent to which the UK has become remarkably successful as a hub for car manufacture.  There’s an interesting piece from the Guardian which discusses how this has arisen, and which rather brings out the international flavour of the whole sector: note that Bentley, while a traditionally British name and a manufacturer of very exclusive vehicles, is German owned and ascribes some of its recent success to adopting Japanese approaches to production.

Thoughts on the ICT curriculum

January 19, 2012

Information technology has always been a subject which can be taught in different ways: in universities we have rationalised this a little by distinguishing between information systems, which is very much a branch of management, and information technology, which is about the nuts and bolts of making systems work.

This has been brought into the public eye by Michael Gove’s recent comments on the information and communications technology curriculum in schools, and his intention to change it to include programming skills.  Mr Gove, as a politician, likes controversial statements, so I should point out that anything I write here doesn’t represent approval or otherwise of his views in general, nor does it represent the views of my employer.

Still, the comment that school ICT lessons need to go beyond just learning to use packages such as Word and Excel does seem reasonable, with three important caveats.  The first is that a lot of ICT teaching already goes way beyond these packages (this piece by an ICT teacher makes interesting reading on the subject) and, as with much discussion of teaching in schools, it shouldn’t be judged either by a few examples of uninspiring practice or, worse, by uninformed views of what happens in the classroom.  The second is that, as in every subject, teachers shouldn’t lose sight of the need to ensure that children are familiar with basic skills.  And the third is that the skills that are important now – even pure technical and programming skills – are different from the ones that will be needed in ten years’ time.  To be fair, I think the proposals do recognise this as a reason not to be over-prescriptive about what should be taught.

Similarly, the idea of a curriculum wiki is a sound one, but with one particular caveat.  It’s great to get a range of opinion on what ICT education needs to offer, and a wiki could be an effective way to achieve this, but like many instances of social media, it needs to be moderated carefully.  If it means employers from Silicon Roundabout discussing what they would like from schools, that’s great.  If it’s dominated by people with prejudiced ideas or narrow agendas, then it’s just an opportunity wasted.

John Sculley, Moore’s law, and the legacy of Steve Jobs

January 19, 2012

John Sculley, whose tenure at Apple was associated with a rather unhappy period when Apple’s computers looked like everybody else’s, has been talking about his experiences.  It makes for a striking contrast with the popular narrative, which pits Steve Jobs the radical thinker against Sculley the traditional corporate operator.  Perhaps most significant is the reference in the article to Moore’s law – the observation that the number of transistors on a chip doubles roughly every two years, with the effect that computing power gets cheaper and cheaper all the time.  It’s important in the Apple context because one of the key things that Apple has achieved over the years has been to recognise when, given the effect of Moore’s law, it’s right to innovate.  Apple introduced the first Macintosh when Moore’s law made it feasible to use computing power to drive easy-to-use, graphical interfaces, and introduced the iPod when it became possible to store significant quantities of music in computer memory on one device.  So you could argue that spotting the right time for innovation, and noticing what has just become possible, is actually one of the drivers for something else that Apple has done well: moving almost seamlessly from a computer company to a music device company to a phone company, while keeping some sort of common thread within the business.
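
For a rough sense of the scale involved, here’s a back-of-envelope sketch in Python.  The two-year doubling period is the figure usually quoted for Moore’s law rather than anything from the article, and the dates are simply the Macintosh and iPod launches, used for illustration:

```python
# Back-of-envelope illustration of Moore's law: assume the computing
# power available at a fixed price doubles roughly every two years.

def relative_capability(years: float, doubling_period: float = 2.0) -> float:
    """Computing power a fixed budget buys, relative to year zero."""
    return 2 ** (years / doubling_period)

# Roughly how much further a fixed budget stretched between the first
# Macintosh (1984) and the first iPod (2001).
gap = relative_capability(2001 - 1984)
print(f"~{gap:.0f}x the computing power for the same money")  # prints ~362x
```

On that sort of trajectory, a product which is impractical in one decade becomes routine in the next – and spotting when that line is crossed is exactly the window for innovation described above.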

So much has been written, in the months since the death of Steve Jobs, about the sort of business that he constructed, that I have little to add.  But recognition of Moore’s law, and of the opportunities made possible by cheapening technology, is one strand in the common thread.  Another is the importance of design (the point that Apple products don’t look like everybody else’s, and also the reason that Jonathan Ive, as designer, has such an important role in the business).  Yet another is the ability to balance innovation – the creation of radically new products – with adaptation and the continuing improvement of existing products.

For all his commitment to changing the way that people live and work, Steve Jobs does appear to have been a fairly ruthless corporate operator, focused on the success of his own business, and part of this stemmed from adopting ideas which had originated with other people.  Famously, the graphical interface of the first Macintosh computers, and the Lisa before it, came from work done by Doug Engelbart and others many years earlier.  Nevertheless Apple deserves credit for identifying when this technology was ready for exploitation, and that’s an ability that shouldn’t be underestimated.

Will disruptive innovation arrive in higher education?

January 13, 2012

A significant amount of my teaching about management of technology and innovation draws on the ideas of Clayton Christensen, and the concept of disruptive technologies.  These are innovations which can radically change the structure of a business and threaten the most powerful established players.  In recent years the rise of the smartphone has been a particularly visible instance of a disruptive innovation changing the structure of an industry, with a very different set of players dominating the smartphone market from those who were most powerful in the traditional phone market.

It’s important to realise that although disruptive innovation has occurred in many places over the years, it’s not a template for every change which is made possible by the introduction of new technology.  This week’s Times Higher Education reports that Christensen has now turned his attention to higher education.  The university sector is interesting because, so far, despite immense use of new technology and changes in the way that learning material is delivered, we haven’t seen huge changes in the structure of the sector.  The specialised players who work with IT and distance learning include the Open University here in the UK, which has been around since the 1960s and has seamlessly evolved from its use of television broadcasts after closedown to today’s elaborate use of the Internet.  Conversely, attempts to create completely online universities in the early 2000s, such as UNext, which included the usability expert Donald Norman among its founders, remained specialist players.

Of course, given that I’m employed by a traditional university, I have some interest in the current landscape not being unduly disrupted – at least until I retire!

Christensen and his collaborators suggest that, in the mainstream of the American university system with which they are most familiar, disruption will be driven by rising costs.  This is a contrast to other areas – for instance, in the phone market disruption was driven by falling costs and by a sense that the trajectory of innovation (to use a word from Christensen’s writing) in products from the likes of Nokia and Sony Ericsson was levelling off.  In any case, this is food for thought…

Banking in the cloud

January 12, 2012

The BBC has also been reporting this major deal for the Spanish bank BBVA to use Google’s services.  It’s notable that their account of the deal stresses the division between internal communication, where mobility is important and security of customer data less so, and the systems that support core banking activities and do store customer data; it suggests that Google’s responsibility is for the former type of system.

Easy photography from another era

January 12, 2012

My previous post, about digital photography, and also the Kenneth Grange exhibition last year, led me to this piece about a Kodak easy-to-use camera from around 1970.  I’m amazed that there is, or at least was in 2008, even one supplier making film for these, even if it is expensive.  Incidentally, I’m prepared to be told I’m wrong, but I think that unless you had the optional flashcube attachment, everything in this camera was mechanical and nothing was electric or electronic.

Innovation and the digital camera

January 12, 2012

There’s an interesting BBC piece at http://www.bbc.co.uk/news/magazine-16483509 on the emergence of the digital camera – and on the prospect that the simple camera could yet be made obsolete by the cameraphone.  One notable point alluded to in the piece is the steady improvement in the technology for automatic focusing.  That improvement was already having an effect in the last twenty years or so of widespread film camera use, and was a consequence of the cheapening of electronic components.  From the 1980s it led to the emergence of a whole range of powerful, compact, and user-friendly film cameras which simplified the process of taking good quality photos.  From a business strategy perspective they filled a gap between very simple point-and-shoot cameras and much more elaborate SLR (single lens reflex) cameras.  But so long as cameras used a storage medium which could only carry 36 pictures at a time, and where you couldn’t see the pictures until you’d taken the film to be developed, these improvements didn’t change the nature of photography in the way that digital cameras have done.  In fact the ability to take lots of pictures and select the one which works best, and the ability to touch up pictures using Photoshop or the equivalent, are both examples of things that professional photographers have always done, but which are now available to almost everybody.

New year, new telepresence

January 7, 2012

This piece from America’s National Public Radio covers the use of robots to simulate the effect of somebody being present in a meeting.  I’ve often thought that this sort of thing offers something closer to real telepresence (creating the sense of physically being in a different place) than the sort of high-quality videoconferencing approaches which are more often marketed as telepresence.  Incidentally, I know that there’s a tendency to use almost any noun as a verb, but I’d never before come across the usage, about halfway down the piece, of a robot receptionist being operated by somebody who ‘remotes into the robots’.