What I’ve been reading

I find this stuff so that you don’t have to.

You can find all my bookmarks on Pinboard.

Evaluating digital engagement

A record number of people took part in our live web chat on evaluating online engagement.

Whilst we didn’t exactly solve the issues in the course of the hour, the discussion threw up some great examples and ideas, and everyone who took part will have taken something useful away with them.

You can relive the chat using the CoverItLive widget on the Kind of Digital website, or download a rather basically formatted PDF transcript.

Hope it’s useful!

News of the next live web chat shortly…

Webchatting: barriers and evaluation

Yesterday’s live Kind of Digital web chat about overcoming barriers to implementing social media went really well.

You can relive the whole thing over on the KoD website, or download a rather basic PDF of the transcript.

Our next chat has been scheduled for Tuesday, August 16th at 11am BST. It’s going to be on the thorny subject of evaluating digital engagement activity – great!

You can sign up for a reminder over here – hope to see you there!

Bookmarks for December 12th through December 30th

I find this stuff so that you don’t have to.

You can find all my bookmarks on Delicious. There is also even more stuff on my shared Google Reader page.

You can also see all the videos I think are worth watching at my video scrapbook.

Bookmarks for April 5th through April 10th

I find this stuff so that you don’t have to.

  • Social Media Security – "We have found a huge lack of accurate information around security issues and awareness of social media. This website aims to help educate users of social media of the threats, risks and privacy concerns that go with using them."
  • E-government is not a financial cure-all – "Whoever is in charge after 6 May, I expect the drive towards "smarter government" (or whatever catch phrase replaces it) to continue. There are simply no other tools in the box. But whoever is in charge will avidly wish someone had made a bolder start while the going was good."
  • bantApp.com: Bant Diabetes Monitoring App for the iPhone and iPod Touch – Interesting iPhone app for diabetes management, via @robertbrook
  • Two models of open innovation – "Based on our recent experience of working on open innovation projects, and also building upon a great paper by Kevin Boudreau and Karim Lakhani, we have concluded that there are two distinct ways of doing open innovation – creating competitive markets or collaborative communities"
  • Let government screw up – "I have the opportunity to speak to groups across government about the benefits, challenges and potential costs of social media. In the face of institutional anxiety, I’ve argued that social media is a positive environment that encourages experimentation. In fact, online users are willing to accept mis-steps and stumbles from government organizations simply because it demonstrates initiative and ambition, if not expertise."
  • Project Spaces: A Format for Surfacing New Projects – home – "The event format I'm calling Project Spaces has emerged from working with various collaborators to facilitate events for communities actively engaged and committed to finding better ways to do things."
  • Can Open Office Escape From Under A Cloud? – "I do see a future for Open Office in the enterprise — one that’s closely tied to integration with collaboration, content management, and business processes and facilitated by the likes of Oracle and IBM."
  • A democratic view of social media behaviours – Interesting action research post from Catherine – plenty to chew on here.
  • Digital exclusion, porn and games – "I wonder if – as with mobile phones – there’s a certain, influential generation that see the technology as being more than just a technology. And instead, a marker for a whole way of life they just haven’t accepted yet."
  • Social media measurement – Great stuff from Stuart Bruce – debunking a few myths and some marketing BS.

You can find all my bookmarks on Delicious. There is also even more stuff on my shared Google Reader page.

You can also see all the videos I think are worth watching at my video scrapbook.

Evaluating online engagement

I’ve mentioned before that we all really need to start evaluating the online engagement stuff we’re doing. Alice Casey’s presentation provides some great pointers for where to start and what to consider:

My main argument was that a good evaluation tells a compelling story through combining qualitative and quantitative information in a clear format to key decision makers and practitioners.

The importance of evaluation

Stephen Hale at the FCO has an excellent, interesting and important post about measuring the success of the London G20 Summit site.

With wonderful openness and transparency, Stephen has set out some of the factors by which the site’s success could be measured, along with the results. It’s fascinating reading, and provides lots of lessons for anyone approaching an engagement project like this.

Indeed, this ties in with Steph’s recent (and overly-modest) post about the achievements of the engagement bods at DIUS over the last year or so. He wrote:

We still haven’t nailed some of the basics like evaluation, [or] the business case

Figuring out whether or not something has actually worked is terrifically important, and the long-term efficacy of online engagement relies on this nut being cracked.

Stephen’s post highlighted some really good practice here: outline what your project aims to do, and come up with some measures around it so you can judge whether it worked or not.

As Steph mentions, having an up-front business case is really important – a written-down formulation of what the project actually is and what it ought to achieve.

Now, business cases and evaluation criteria can be developed in isolation and on a project-by-project basis. I wonder, though, how much more value could be created by developing a ‘package’ of evaluation which could be used as a foundation by everyone involved in government online engagement?

Of course, each project has its own unique things that will need to be measured and tested, but surely there are some basic things that every evaluation exercise would need to look at?

How about creating some common evaluation documents, and making sure every project records the basic, common stuff as well as the unique bits? That way, some kind of comparative analysis would be possible, especially if everyone submitted their results into a common database.
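To make that a bit more concrete, here’s a rough sketch (in Python, purely as an illustration – every field name here is hypothetical rather than any agreed standard) of what a common evaluation record might look like, with the shared basics alongside a slot for each project’s unique measures:

    # Hypothetical sketch of a common evaluation record – field names are
    # illustrative only, not an agreed government standard.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class EngagementEvaluation:
        project_name: str
        aims: list[str]                  # what the project set out to achieve
        start_date: date
        end_date: date
        unique_visitors: int             # basic quantitative measures every
        contributions: int               # project could record in common
        qualitative_summary: str         # the story behind the numbers
        extra_measures: dict[str, float] = field(default_factory=dict)  # the unique bits

    # Example record for an imaginary consultation project
    record = EngagementEvaluation(
        project_name="Example consultation",
        aims=["Gather public comment on policy X"],
        start_date=date(2009, 3, 1),
        end_date=date(2009, 4, 30),
        unique_visitors=12000,
        contributions=450,
        qualitative_summary="Comments fed directly into the final policy paper.",
        extra_measures={"press_mentions": 14},
    )

If every project filed something along these lines into a shared database, the comparative analysis becomes a straightforward query rather than a research exercise.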

Just how hard would it be to come up with a common framework for online engagement projects? I think it is worth a shot.