Bjelkeman's travel notes

Travels with the cloud in my pocket.

openaid.se, Swedish development aid transparency

This was originally posted on the Open for Change blog

openaid.se screenshot

Today I attended the launch of the new aid transparency effort, openaid.se, a joint effort between the Swedish Ministry of Foreign Affairs and the Swedish International Development Cooperation Agency (SIDA) to show where Swedish government development aid money is going. The Swedish Minister for Development Cooperation, Gunilla Carlsson, presented the effort and described the work in some depth.

Together with Akvo, I was asked by the Swedish Foreign Ministry to review and give feedback on the openaid.se site before the launch. I was also part of a review panel which discussed the work after the presentation, together with a very engaged audience.

I think openaid.se is a very good effort to start showing the Swedish aid budget. The team working on this was clearly very passionate about the work and has put in a lot of effort making both budgets and thousands of documents visible online. We would like to commend everyone involved on a great start.

openaid-tv picture

To see the video of the launch event, click the picture above. The panel, which I was part of, starts 34 minutes in.


Filed under: Development aid, Open source

Heroes: Becky Straw at The Adventure Project

Adventure Project World Water Day

The Adventure Project was co-founded by Becky Straw, a friend and hero of mine. The Adventure Project is a non-profit organization established last year to increase investments in positive social enterprises around the world.

For World Water Day they pledged, with the help of a whole bunch of bloggers, to help raise money for a particular water project. I promised to help them, but I was sick and didn’t manage to get my pen out yesterday. But I will write about them anyway, as I really like what they do.

Becky used to work for Charity: Water before she started The Adventure Project. I met her in Istanbul at the World Water Forum, where she presented at the “Thinking outside the water box” session, the session before mine.

The next thing I am going to do is click on the banner above and donate money to the project they are supporting in India. The money is used to train and employ handpump mechanics. And if you, like me, have been in India, then you know there is a big pent-up demand for that. Lots of broken hand pumps.

To wrap it up, here is a short video which @charmermark filmed of Becky, when she worked with Charity: Water, together with Ian Thorpe, then at PumpAid.

[blip.tv ?posts_id=2759081&dest=-1]

Filed under: Crowd-sourcing, Heroes

Why more nuclear power doesn’t make any sense

Nuclear power plant Mochovce

Picture by Michal Brcak.

I write this without holding any illusions that anyone will actually read this, nor do I expect to convert anyone. I just need to get it out of my system. So there you have it.

Safety of operating plants

According to Guardian Data there are 442 operating nuclear power plants in the world. On average they have been in operation for 26 years. [1] There are also some 66 reactors which have been shut down or decommissioned for one reason or another. Together, these plants have accumulated a little over 13,000 years of operation. [2]
The International Atomic Energy Agency ranks nuclear events on a scale called INES, from 1-7 (anomaly to major accident), where ranks 4-7 are classified as accidents. Again according to Guardian Data, there have been six accidents with wider consequences (levels 5-7):

1952, Chalk River, Canada, INES 5
1957, Windscale Pile, UK, INES 5
1957, Kyshtym, Russia, INES 6
1979, Three Mile Island, USA, INES 5
1986, Chernobyl, Ukraine, INES 7
2011, Fukushima, Japan, INES 5

In total (according to the Guardian) there have been 33 recorded serious incidents and accidents involving nuclear power. So with 13,300 operating years, we have had one serious accident or incident per 405 operating years, and a level 5-7 accident (like the ongoing Japanese accident) every 2,230 operating years.

With 442 plants in operation, if the same accident frequency holds, we will have a serious incident or accident every year. We will also have an accident on the level of Fukushima every five years.

It could be argued that things are getting safer, as we have only had three big accidents since 1957. But half of the incidents recorded by Guardian Data have happened since Chernobyl.

According to the World Nuclear Association (WNA) there are 62 nuclear power plants under construction, 158 on order or being planned, and a further 324 plants proposed. The WNA also suggests that at least 60 of the currently operating plants will shut down by 2030, which would leave us with 926 nuclear power plants operating.

If the failure rate stays the same we would have a nuclear power plant incident every five months and an INES 5-7 accident every two and a half years.
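For anyone who wants to check the arithmetic, here is a minimal sketch of the back-of-the-envelope calculation behind those intervals, using the rounded Guardian Data figures quoted above (the numbers in the post were rounded slightly differently):

```python
# Back-of-envelope accident frequency from the figures quoted above.
operating_years = 13_300  # total reactor operating years (Guardian Data)
serious_events = 33       # recorded serious incidents and accidents
major_accidents = 6       # INES level 5-7 accidents

years_per_serious = operating_years / serious_events  # ~405 in the post
years_per_major = operating_years / major_accidents   # ~2,230 in the post

# Current fleet (442 plants) and the projected 2030 fleet (926 plants).
for fleet in (442, 926):
    print(f"{fleet} plants: a serious event every "
          f"{years_per_serious / fleet:.2f} years, an INES 5-7 accident "
          f"every {years_per_major / fleet:.2f} years")
```

For 442 plants that comes out at roughly one serious event per year and a major accident every five years; for 926 plants, one serious event every five months and a major accident every two and a half years.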

Why would the failure rate stay the same?

There is a clear track record of safety failure in the nuclear industry in several countries. Here are a couple of examples:

“The unfolding disaster at the Fukushima nuclear plant follows decades of falsified safety reports, fatal accidents and underestimated earthquake risk in Japan’s atomic power industry.” Bloomberg, 18 March 2011.

“State-owned Swedish energy concern Vattenfall has admitted serious security deficiencies at its controversial Forsmark nuclear power plant.” Power-Gen Worldwide, 12 February 2007

“Between 1950 and 2000 there have been 21 serious incidents or accidents involving some off-site radiological releases that merited a rating on the International Nuclear Event Scale, one at level 5, five at level 4 and fifteen at level 3.” Sellafield article, Wikipedia.

If countries like Japan, Sweden and the UK cannot make their nuclear power operators follow safety protocols, where do you expect it to work better?

But there are other reasons why we should question nuclear power.

Safety of storage

As a trained geologist I actually think spent nuclear fuel storage can be solved reasonably well. However, essentially nobody wants it in their backyard, and nobody has actually started long-term storage of spent fuel yet.

“Finland plans to have a long-term waste repository operational in 2020, Sweden in 2023 and France in 2025.”

In Scandinavia we have relatively good and stable granite bedrock to store spent fuel in, but where is the rest of the waste from 440-900 nuclear power stations going to go? Maybe some poor country with good bedrock will become the nuclear waste dump of the world. Sounds great.

“A draft EU directive presented on Wednesday calls for national plans to be drawn up in the next few years, as the EU still has no final storage sites for nuclear waste.” BBC News, 3 November 2010.

All the spent nuclear fuel in the world is currently in short-term storage. Like the storage which may be causing trouble in Japan at the moment.

“The Nuclear Regulatory Commission estimates that many of the nuclear power plants in the United States will be out of room in their spent fuel pools by 2015, most likely requiring the use of temporary storage of some kind.” US Nuclear Regulatory Commission

Great idea.

Nuclear proliferation

More nuclear power plants mean more nuclear weapons. The ongoing debacle with Iran, plus a state barely in control of itself (Pakistan) and one on the brink of collapse (North Korea) holding nuclear weapons, is, I believe, just the beginning of nuclear proliferation if we keep depending on nuclear power for our energy needs.

At some point nuclear weapons will be used. If the attackers of 9/11 had had access to a nuclear weapon, do you think they would have refrained from using it?

Complexity of nuclear power

If you invest a lot of money in more nuclear power plants, you can’t take any of that and give it to a family in a failed state like Somalia to help fix their power shortages. But if you instead invest it in cheap solar power, like Nanosolar or First Solar, you can even sell a Somali family a power plant at the household level, without major risk to them, their surroundings or the environment, and it is simple enough for even my old grandmother to operate.

The majority of the increased power need in the world is in countries which are not well developed, and it would be foolish to believe that we could help them by building and operating nuclear power plants there. In fact you can’t run and operate a nuclear power plant unless you have sophisticated infrastructure, in the shape of a functional government, national administration, education and technology, so it is no real help for the developing world.

That Pakistan and North Korea have nuclear power is irrelevant in this context, as they only have it to produce weapons-grade plutonium. The importance of nuclear energy for them is less than secondary.

Uranium mining

Uranium mining is one of the nastiest businesses in the whole mining industry, and its environmental impact is big. We used to mine uranium in Sweden, but this was discontinued; like the rest of Europe, we now buy our uranium from other countries, such as Australia, where the mines are in the outback: out of sight, out of mind.

Nuclear is CO2 free

Whilst it is true that an operating nuclear power plant doesn’t emit much CO2, it does when you take the whole lifecycle into account: mining uranium, building and decommissioning the plant.

“However, nuclear emits twice as much carbon as solar photovoltaic, at 32 gCO2e/kWh, and six times as much as onshore wind farms, at 10 gCO2e/kWh. A number in the 60s puts it well below natural gas, oil, coal and even clean-coal technologies. On the other hand, things like energy efficiency, and some of the cheaper renewables are a factor of six better. So for every dollar you spend on nuclear, you could have saved five or six times as much carbon with efficiency, or wind farms.” Nuclear energy, assessing the emissions, Nature, 24 September 2008.
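The ratios in that quote are easy to verify (a quick sketch; the 66 gCO2e/kWh value for nuclear is my reading of the “number in the 60s” from the Nature piece):

```python
# Lifecycle emissions quoted above, in gCO2e/kWh.
emissions = {"nuclear": 66, "solar PV": 32, "onshore wind": 10}

for source in ("solar PV", "onshore wind"):
    ratio = emissions["nuclear"] / emissions[source]
    print(f"nuclear emits {ratio:.1f}x as much CO2e as {source}")
# -> roughly 2x solar PV and 6-7x onshore wind, as the quote says
```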

Baseload

You will sometimes hear the term baseload and also hear that nuclear power plants are needed to provide baseload power. Baseload is what people call the power we need “regardless of whether the sun shines or the wind blows”.

An overview of why this is wrong can be read in Do we need nuclear and coal plants for baseload power? by David Roberts, and a more detailed description in Amory Lovins, Four Nuclear Myths, Rocky Mountain Institute, 13 October 2009.

Peak uranium

Something often overlooked is that there may not be as much uranium around, at the required price, as the nuclear industry would like. My take is that there is probably enough fuel for the 900 plants which is the maximum expected by the nuclear industry over the next 30 years, specifically as fuel isn’t a significant part of the cost of a plant, i.e. a nuclear power plant is relatively insensitive to higher nuclear fuel prices. More at “Uranium Depletion and Nuclear Power: Are We at Peak Uranium?”, The Oil Drum, 21 March 2007.

Cost of nuclear power

“The Union of Concerned Scientists recently reported that nuclear subsidies total nearly 7 cents per kWh, twice what a typical wind power plant receives and similar to the federal incentives offered for solar power.” Nuclear Power, Still not viable without subsidies, Union of Concerned Scientists, February 2011 [PDF file]

This article at Grist is a good overview: Cost, not Japan crisis, should scrub nuclear power. Specifically you should note the following quote:

“In the time it would take to build a nuclear plant (6-8 years, optimistically), every commercial energy technology could produce electricity for less.”

In other words, the cost of building wind, solar, biofuel, small-scale hydro and other renewable energy systems will most likely have caught up with nuclear before you can complete a new nuclear power plant.

In the UK the nuclear industry refuses to build any new plants without huge government loan guarantees.

Fourth generation nuclear power plants

Another argument which often comes up is that the next generation of nuclear power plants will “improve nuclear safety, improve proliferation resistance, minimize waste and natural resource utilization, and to decrease the cost to build and run such plants.”

But these supposedly improved nuclear power plant designs are paper tigers.

Other than one design, which could theoretically see a first implementation during the mid-2020s, these are just research projects today and could at the earliest enter production during the 2030s. And if you have studied any climate science at all, you know that pouring billions into uncertain, centralized, expensive nuclear power station projects is not what we need right now. Essentially no new nuclear power plant is ever delivered on budget. These new research projects are bound to cost a lot more than what is presented right now. (If you can find any estimates at all. I didn’t.)

“The severe difficulties of Finland’s Olkiluoto nuclear reactor being built by Areva SA, the French state-owned nuclear construction firm, provide a reminder of how these problems unfold. Touted as the turnkey project to replace the aging cohort of nuclear reactors, the project has fallen three years behind schedule and more than 50% over budget. The delay has caused the sponsors of the project to face the problem of purchasing expensive replacement power; the costs of which they are trying to recover from the reactor builder. The cost overruns and the cost of replacement power could more than double the cost of the reactor.” The economics of nuclear reactors: Renaissance or relapse?, Mark Cooper, Senior Fellow for Economic Analysis, Institute for Energy and the Environment, Vermont Law School, June 2009

And that is not even a fourth generation design.

Another example of this type of argument was sent to me yesterday: “On energy and the end of civilization”, Warren D. Smith, 2001, where the author lays out an argument showing that fossil fuels, including uranium but excluding coal, will be too expensive to use within the next 30-70 years. (Note that this was written before the understanding of peak fossil fuels was what it is today, where we actually have hit peak oil, but that is a different blog post, one day.) Then he argues that solar won’t work, as it is too hard, and the solution is … breeder reactors.

These nuclear reactors enable the use of U-238 (converted by neutron irradiation into fissile Pu-239) and Th-232 (converted to fissile U-233) as fuel, not just the (far rarer) U-235. This will enable energy production at current rates for 1000s of years using only known reserves of Thorium and Uranium.
Breeder reactors work. One was in large scale commercial use in France… only problem is: “in June 1997 France said it would scrap their highly controversial $4.7B Superphenix nuclear fast-breeder, saying it was too costly and of doubtful value.” A French govt report in 1996 concluded it had cost the state $12B. The planned shutdown in 2005 will cost $20B more. This was the world’s largest fast-breeder but it had managed to operate for only 6 months through 1997 since it began generating power in 1985. Oops. France’s electricity is 80% nuclear due to French leadership thinking it had no other choice.
There had been a major sodium leak at Superphenix in 1987 but it had re-begun operating in 1994 after a 4-year layoff. Britain similarly had closed its Dounreay fast-breeder in 1995. The US operated an experimental fast breeder at Shippingport Atomic Power Station in the 1970s and early 1980s. The reactor had a core that was designed to produce Uranium-233 from Thorium-232. Although it showed no signs of ending its useful life, the experiment was ended due to budgetary concerns and interest in analyzing the core to see if breeding had occurred. When analyzed, the core indeed contained 1.3% more fuel than it had originally contained.
Japan in Dec 1995 shut down their Monju fast breeder, which took 12 years and $4.91 billion to build, after a massive coolant sodium (very flammable!) leak. There was a furor over cover-ups of the incident with doctored videos and incomplete reports.

Breeder reactors are also interesting as it is supposedly easier to produce weapons-grade plutonium in them. The person who sent this to me says this is our last hope.

Sounds great, doesn’t it?

Conclusion

The nuclear lobby thinks we need to overlook the faults of nuclear power. They want us to accept regular catastrophic failure, nuclear weapons proliferation, the unsolved problem of final spent fuel storage, the fact that investing in nuclear power doesn’t help the world’s 2 billion poor, the fact that it emits more CO2 than renewables, and the environmental mess that uranium mining creates.

They want us to invest in nuclear power because “There Is No Alternative” and they argue that nuclear power is a cost effective solution. But it isn’t.

So what is left of the argument? Nothing.


Notes:

[1] There were three plants which didn’t have a start date for operations, so I gave them the average operating lifetime. I calculated the years in operation by deducting the start year from today’s year, i.e. 2011 − start year = years in operation. The median number of years in operation is also 26.

[2] I assume the decommissioned plants had operated for 26 years, as I have no data on their operations. I have also ignored the time when plants are down for maintenance.
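In code form, the estimate in these notes boils down to a couple of multiplications (a sketch of my own arithmetic, nothing more):

```python
# Total reactor operating years, per notes [1] and [2]:
# each of the 442 operating plants averages 26 years of operation,
# and the 66 shut-down plants are assumed to have run 26 years each.
AVERAGE_YEARS = 26
total_years = (442 + 66) * AVERAGE_YEARS
print(total_years)  # 13208 -> "a little over 13,000" operating years
```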

Edit: Changed the title to “Why more nuclear power doesn’t make any sense” from “Why nuclear power doesn’t make any sense” as this was more in line with my intent of the article. I am not of the opinion that we should decommission nuclear power plants before their end-of-life, to replace them with fossil fuel power plants.

Edit 2: Added the section on fourth generation nuclear power plants.

Filed under: Climate Change, Facts, Social and economic policy

Law is hard. Code is harder. Why new internet and software architecture will define the future of society

From left: Lawrence Lessig, Vinay Gupta, Srikanth Nadhamuni. Picture of Vinay by @charmermark, the other two by me.

Something which Vinay Gupta said the other day brought together several strands in my head. Vinay called it Foreign Policy by Internet Protocol. It is short enough to be quoted in full:

Foreign Policy by Internet Protocol
1. 5.1 billion cell phones, soon to be 7 billion smart phones on 3G networks
2. increasingly valuable services delivered over international borders, like Google
3. global shared knowledge bases like wikipedia or satellite maps
4. telemedicine, tele-engineering, micro-consultancy, social media and so on as the tools spread into new areas of life

Non-state actors conducting FPIP include WikiLeaks, Appropedia and many other groups. Currently it’s not at all clear that any state has begun to effectively deliver FPIP.

Vinay Gupta, Foreign Policy by Internet Protocol (2011) [1]

If you combine the thought that our communications infrastructure is going to start to dictate how we think about the world with what Lawrence Lessig says, “The Code is the Law”, then a number of things going on in the world today can be seen in a very different perspective from what you get in your average newspaper opinion piece.

Of course, Lessig was years ahead of me in thinking about this; in his piece The Code is the Law from 1999 he says [2]:

“The single most significant change in the politics of cyberspace is the coming of age of this simple idea: The code is law. The architectures of cyberspace are as important as the law in defining and defeating the liberties of the Net.”
Lawrence Lessig, The Code is the Law (1999)

The Code is the Law

Consider the example of copying copyrighted works. You break a multitude of rules and laws if you copy a copyrighted work. Some countries are trying to implement some pretty draconian laws to stop copying over the internet, like the “three strikes and you are cut off” laws [3], which are being met with quite a lot of resistance at the moment. But that hasn’t really stopped anyone from actually breaking these laws. The flow of information over peer-to-peer (P2P) networks is increasing, and new laws seem to have only a short-term effect on people’s behavior [4].

Google holds billions of images on its giant server farms, with caches of images from web sites. According to the letter of the law, Google is breaking copyright law when doing that. YouTube’s HTML5 trials made it possible to download every video on YouTube to your computer (they seem to have disabled that again), and there is an enormous amount of material on YouTube which breaks copyright laws and rules. There are some big ongoing lawsuits against Google, which owns YouTube, but in essence, for most people and companies YouTube is more useful than it is a threat, despite what the law says. Add to this that the internet works by making a copy of a web page or a picture so you can view it on your computer, and it is trivial to copy it from the web browser cache to save it for later.

In short, the architecture of the internet has a stronger influence on how people behave than what the law says, as long as the majority of the people see a significant benefit.

The extension of this is that software architecture starts defining how our society behaves. Furthermore, I think that internet architects and coders who build useful systems may, in the long run, have a bigger influence on our future society than politicians and the traditional power-brokers have. Why do I believe this?


Filed under: India, ITC technology, Social and economic policy

Identifying more than a billion Indians, another take on Gov 2.0


Srikanth Nadhamuni, tech lead for the Indian UID project. Image: Gireesh G V for Forbes India

The Indian UID project is very interesting to me, as the work they are doing is on an enormous scale. There are other systems which reach this scale, and arguably are more complex than this (Facebook for example), but it is still impressive.

“By 2014, the government wants half of India’s population to be allotted UID numbers. To do that, the Authority will photograph a staggering 600 million Indians, scan 1.2 billion irises, collect six billion fingerprints and record 600 million addresses.”

Read more in this rather good Forbes India article. Another article about this was published in the Economist yesterday (although, together with my friend Gabriel, I am still pondering what the 14 billion transactions per second actually mean).

Whilst a country like Sweden, where I live, is struggling with a hodge-podge of identification services to be used online as well as offline, India isn’t just going to launch an online system of staggering scale, it is also going to leapfrog our old systems in a giant leap. Once they are up to speed with issuing IDs, they could issue biometric IDs to the whole Swedish population in just over a week. At peak they expect to issue 1 million IDs per day.
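The “just over a week” claim is simple arithmetic; here is a sketch, where the peak rate is the UID project’s stated figure and Sweden’s population of roughly 9.4 million in 2011 is my assumption:

```python
# How long would the UID enrolment machinery, at its stated peak rate,
# take to issue biometric IDs to everyone in Sweden?
PEAK_IDS_PER_DAY = 1_000_000   # UID project's expected peak rate
SWEDEN_POPULATION = 9_400_000  # assumption: roughly Sweden in 2011

print(f"{SWEDEN_POPULATION / PEAK_IDS_PER_DAY:.1f} days")
# -> 9.4 days, i.e. "just over a week"
```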

Srikanth and my wife Anke taking a break during the bicycle ride on the outskirts of Bengaluru, buying some coconuts from a street vendor. January 2010.

A friend of mine, Srikanth Nadhamuni, leads the technical development from the Indian government side, and it is really rather interesting to talk with him about the implications of this system.

One aspect which doesn’t get much coverage is that they are going to use the UID system to facilitate very inexpensive money transfers for people. This is in a country where a lot of people, maybe even most of them (hundreds of millions of people) don’t actually have a bank account at all today.

Another aspect which is interesting is that the team started the development in a way which would be very familiar to many Hacker News readers. They worked out of an apartment in Bangalore, where several team members lived as well as worked, in a true startup atmosphere. Software companies like Microsoft and Google would show up with teams and end up sitting around the kitchen table, or on the spare bench from the hallway, to participate in sessions where the project was being discussed.

They have software volunteers, expat Indians, coming in from all over the world to work on the project, and the top-level people behave just like any other software startup entrepreneurs, sitting up until 4am doing code reviews, walking into a room and asking: how’s it going? Not the usual bureaucratic India you would expect.

If I wasn’t working on what I work on right now, I would probably have been a volunteer on the project myself; if they would have had me, that is. 🙂

Edit: I have written about the UID project before, but it was quite short.

Filed under: India, ITC technology, Social and economic policy

Governance is the last mile problem

Picture by Mark Charmer

Yesterday I had the privilege of spending several hours with Sunita Nadhamuni. We had a lot to talk about, as we hadn’t met since the summer. Sunita sits on the board of Akvo.org as well as running one of my favorite organisations in development aid, Arghyam, which means I am lucky enough to be able to book some time with her and make it seem legitimate.

As usual the topics of discussion ranged far and wide, but what really made me grab for a notebook to quickly scribble down a quote was something she said when we were discussing fundamentals around development aid. Sunita said:

“Governance is the last mile problem.” – Sunita Nadhamuni

The last mile is an expression often used in the internet and telecommunications business when discussing how to get people connected to telephone or internet services. There is often sophisticated communications infrastructure available locally, but no money, or rather a perception that it is too expensive, to get everybody hooked up. The investment in the required “last mile” connection is often unpalatable, but without it there is no point in building the infrastructure in the first place. It may be easiest for a northerner like myself to understand the challenge economically through an example: the single biggest cost in transporting food to your table is not, as one would expect, getting it to the supermarket, but getting it from the supermarket to your home, i.e. the last mile. 1

But back to development aid. In our discussions yesterday we noted that in the segment of development aid in which we work, water and sanitation, there seems to be a particular challenge in getting these services deployed on the ground, related to the local situation. It doesn’t matter if the national or state government sets goals, understands the problem and sends out decrees, if there is no local capacity to both understand the problem and work out how to approach solving it. The most successful efforts at solving water and sanitation problems (and I believe this applies to education, healthcare and other areas as well) are when you manage to engage the local community to the point where it not only understands the problem, but owns the solution. It doesn’t matter how many NGOs work on the issue in a country like India, or anywhere else for that matter, if you can’t successfully get the local community to engage with the problem. In countries or regions which have functioning water and sanitation systems, the solution nearly exclusively involves the local community and the local government.

In communities which do not have these services, the main problem is not what technology to use, or how to build it, or who should be responsible for it or own it, but a matter of getting people to sit down together, discuss the issue and work together to solve the problem. It is nearly always a matter of governance.

At university I spent four years studying environmental problems and water-related issues, but I only had three (3!) days learning about governance. When discussing water and sanitation issues there seems to be no end to the discussions about what technology to use, whether access to clean water is a human right or not, and what government policy on the subject should be. But good examples of how to make it work locally are harder to come by, or maybe harder to share, as the context of a successful solution is often what I would call “hyper-local”. In India, central government actually seems well aware that the solution should be local, but until now it seems to have had a hard time translating that into action. This may be about to change. Arghyam is currently working with the Indian central government in reviewing the current progress of the five-year plan and planning the next five years, and this time, possibly for the first time, there is organized feedback from the grass-roots level. Hundreds of participants from the gram panchayat level of government (village council) are participating and collaborating with other participants from state and national level to give feedback on the central government plans.

Technically we know what to do. The money is there to do it. The challenge is to engage people in an open discussion to make it happen. It is democracy. Governance is the last mile problem in water and sanitation. When you make that work the rest is easy.


Footnotes

1. The Validity of Food Miles as an Indicator of Sustainable Development, DEFRA, July 2005

Edited: 13 January 2011, fixed a spelling error. Thanks to @PraveenaSridhar for finding it.

Filed under: Arghyam, Development aid, India

Where open data leads us

This piece was originally posted on the FutureGov blog.

Akvo RSR

Data is the lifeblood of any decision maker. To make good decisions, you need to understand what is going on. For anyone making decisions, money is super data: not only does money tell you if your customers like what you do, but it also indicates whether the projects and departments are running well. Are you over budget? Do you have money to operate next quarter? The problem is that you can’t easily measure everything in money. Did 18 schools get built instead of 23? Is this good or bad? What if the schools that got built were in the poorest area because of changes in priority? In this case, the end result of building fewer schools was actually better. The story attached to the data enables us to interpret the raw data.

Akvo Foundation, of which I am a co-founder and director, was recently awarded a contract by the Dutch government to operate and enhance Akvo Really Simple Reporting (RSR). Akvo RSR makes it easier to see the work done by development aid organisations, opening their projects up online and helping them share status updates as they happen, via the web and mobile phones. In other words, Akvo is providing the ability to show the data, as well as the story behind the data. We will operate and enhance Akvo RSR as an online service for five years, as part of a development aid programme by the Dutch Ministry of Foreign Affairs, called MFS II.

Akvo is a non-profit foundation, which we operate on the basis of “not for profit, not for loss”. We have a business model and charge for some of our services. The online service that we develop and operate is nearly all based on open source software – and Akvo RSR, which is our core service, is published as such, too.

What is new here is that a small non-profit, open-source foundation is running a small but growing part of the Dutch government’s information technology infrastructure. MFS II in total is worth about 2.1 billion Euro, while the pieces we provide online reporting for are “only” worth 100 million Euro, less than 5% of the total. But we are already talking with many of the different NGOs which manage the rest of these development aid funds about extending their use of Akvo RSR, and if reactions from others around the world, including several governments, in the last couple of months are anything to go by, this is going to be big.

Will anyone get fired for buying open source?

There used to be a joke that “nobody ever got fired for buying IBM”. Indeed, 10 to 15 years ago, the scenario I describe to you today would have been inconceivable. Back then an online services contract like this would have gone to a very different organisation, like IBM, and it would have been built on a “proprietary” platform with the data locked away. There is a really interesting trend here where many governments have realised that democracy is better served with open data and open platforms, and that you can run this on open source software. The most discussed examples of this are data.gov in the US and data.gov.uk in the UK. But the trend is discernible in many places, from Norway to Argentina – just see the make-up of the Working Group on Open Government Data hosted by the Open Knowledge Foundation.

In our case the Dutch government is paying for Akvo to not only build and operate this open data service, but also leaves it up to our entrepreneurial selves to define exactly how the service should function, and I would argue that this is both progressive and bold. They could have demanded that we deliver information in some particular way, but they didn’t, and I predict they will be really happy with the end result.

A strong reason to not specify how the data should be delivered is that the potential for innovation in this field is very high right now, and deciding how things must look in five years would be an exercise in futility. Consider that sites like YouTube (2005), Facebook (2004) and Twitter (2006) have gone from zero to hundreds of millions of users in the same timeframe. No one, least of all their founders, could have conceived what they look like today.

A 21st century safety valve

The fact that Akvo RSR is open source software has acted as a potential safety valve for our government investors. They understand that open source software is better to have than proprietary closed software when the government is betting on a startup, as the end result can be reused easily if things go off-course. But I am now convinced that this is last decade’s battle. Open source guru Tim O’Reilly has long argued that the new frontier in the computer industry is not about open source software, but about open data.

The issue in the future isn’t whether you have your data in a proprietary software system, but whether you can share it with others openly. So-called data lock-in is a real danger. (Indeed, essentially all our NGO partners are struggling with proprietary data systems holding a lot of data which they can’t easily migrate out of.)

Some of the most interesting uses of open data in recent years have come from non-traditional sources. 25,000 people got involved in reviewing the expenses of UK members of parliament after the Guardian newspaper and website provided a user interface for the data.

I hear a number of people at government level in the UK express the thought that the headache of making government more transparent can be cured by just opening up the data. But I think this is false. You can’t replace a bureau of statistics with an army of volunteers who trawl open government data for fun and produce information that makes sense and that the government needs to run a country. A few cases will titillate the public into participating, but you would be extremely foolish to base your data analysis strategy on it. But that doesn’t mean that government shouldn’t build open data systems. After all, it is the people’s government and now, when it finally is economically feasible to do so, the information that the government uses to perform its work really should be available to all of us (with reasonable exceptions with regards to privacy and safety).

Conclusions

To best serve democracy I suggest that not only should civil society functions, like development aid, use open data interchange standards like the International Aid Transparency Initiative (IATI) XML schema, but they should also run them on open source software and provide open application programming interfaces (APIs) for data interchange.
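To make the data-interchange point concrete, here is a minimal sketch of what publishing a single project as IATI-style activity XML could look like. The element names follow the public IATI activity standard, but the exact attributes, organisation reference and identifiers are illustrative assumptions, not a validated document:

```python
# Sketch: serialising one aid project as IATI-style activity XML.
# Element names follow the IATI activity standard; the attributes and
# identifiers here are illustrative, not validated output.
import xml.etree.ElementTree as ET

def activity_xml(org_ref, project_id, title, country_code):
    activities = ET.Element("iati-activities")
    activity = ET.SubElement(activities, "iati-activity")
    ET.SubElement(activity, "iati-identifier").text = f"{org_ref}-{project_id}"
    ET.SubElement(activity, "reporting-org", ref=org_ref).text = "Example NGO"
    ET.SubElement(activity, "title").text = title
    ET.SubElement(activity, "recipient-country", code=country_code)
    return ET.tostring(activities, encoding="unicode")

# Hypothetical project, not a real identifier:
print(activity_xml("NL-EX-123", "0042", "School sanitation, phase 1", "IN"))
```

The point is less the exact schema than the principle: any system, open source or not, can emit and consume a shared open format like this, which is what keeps the data itself from being locked in.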

I think it is foolish to believe that “if you open it they will build what you need on it, for free”. Government needs to open up data infrastructure, but also needs to invest money to improve, innovate around and maintain this infrastructure. Consider looking to outsiders for many of these functions, as old hierarchical institutions are notoriously bad at innovation. But government must also have good technologists advising the decision makers, or you are in for a world of hurt, like the UK government’s failed IT health system, scrapped at a cost of 6 billion pounds.

I think that the future government communication infrastructure will be immensely more complex than what we have today, with a plethora of services available, some offered by non-profits, some by companies and some by governmental organisations. Some will even be offered by individuals.

The challenge will be to keep the services open: open in a way which allows you to move the data and the services to another service provider without losing anything on the way.

Thomas Bjelkeman-Pettersson is co-founder and chief technology officer of Akvo.

Filed under: Uncategorized

Video conferencing for better virtual organisations

A few of the Akvo team using EVO during a meeting.

I work in a virtual organisation, called Akvo Foundation. The Economist describes virtual organisations as having “an almost infinite variety of structures, all of them fluid and changing”. Some virtual organisations have no or few employees, others have no offices, premises or physical assets, some are not even formal organisations but groups of organisations or people working together. Akvo technically has no office and no staff, but we are 12+ people who work for Akvo and some of us sit in offices. (I did describe how this works in some detail in an earlier post.) This can be somewhat of a challenge at times, especially when you can’t see each other face to face.

Several in the Akvo team have worked together previously in other organisations. Ten years ago we had people in San Francisco and Stockholm at times, and we used AXIS web cameras and speakerphones to get something which resembled a video conference. The hardware was relatively expensive and the result was not great. But it made a difference to be able to see the people you were talking to, despite the video quality being really poor most of the time. In fact, it made such an impact on us that I have been hunting for the perfect “virtual office video wall” tools ever since. With a vision of having a video wall which constantly shows the other offices.

Since we started Akvo in 2007 we have been using iChat and Skype for video conferencing: iChat when you need 2-4 computers hooked up and Skype when you only want to connect two computers. Our experience with this is somewhat mixed. iChat, which we use with AIM accounts (as we don’t have to pay for them), often has trouble with the central servers when setting up standard text chats, but we feel that the user interface of the chat client is better than, say, Skype’s, so we persist in using it. iChat often also has trouble setting up video connections, which I think is related to the UPnP services in the routers we are using. It is not surprising that Apple seems to have decided to retire the iChat software and replace it with FaceTime, which is now available on the iPhone/iPad/iPod Touch and in beta on the Mac. Skype tends to be more reliable but, alas, is only available on the Mac for two-party conferencing today. (Skype is beta testing multi-party video calls at the moment, but only on Windows.)

So on and off I have been looking at a number of different alternatives, most of which were not really very attractive for a small non-profit like ours. Many video conferencing systems cost an arm and a leg and require dedicated hardware, like those from Cisco and Tandberg. Others want a monthly service fee for a service which often isn’t better than Apple’s iChat, which is free.

One intriguing alternative was AccessGrid, which is open source software used by a number of academic institutions for large-scale video conference solutions. However, AccessGrid isn’t really suitable for us, as you need multicast capability at every node to really take advantage of the tools AccessGrid offers, and the ISPs we use don’t offer that. There are multicast bridges you can use, and recent versions of AccessGrid have unicast tools, but they are not ideal. Add to this that AccessGrid development for Mac OS X, our main desktop/laptop OS at Akvo, is rather slow to put out functional versions, and this turned out not to be an attractive solution.

Recently we have been using EVO from Caltech. EVO is free, runs on a number of different operating systems (it is developed in Java) and can do multi-party video conferencing. I don’t actually know how many parties it supports, but we have been connecting up to seven nodes so far. Echo cancellation for audio in EVO isn’t great, so we are using Skype for the audio part of a meeting. EVO has a text chat function too, but we didn’t like it much. In fact, we are using iChat for the text back-channel, Skype for the audio and EVO for the video. It is kind of messy, but it works. Every morning one of us creates a new private “room” in EVO for Akvo (rooms last until midnight unless you extend them by 24 hours; after that they go away). We log in with three dedicated computers every morning: one in the Akvo Hague office, one in my home office and one in the Akvo London office. Each computer has a dedicated monitor for the video conference. Anyone else in the team can check in at any time, either just watching or turning on their camera as well. It is a bit like someone coming up to the meeting room you are in and looking through the door.

It actually makes a big difference to our virtual office environment, where I can just glance over at the big screen at one end of my office and see if the Dutch contingent has gone for lunch yet or if they are still working. It changes the dynamic of how I interact with my colleagues a lot, in a positive way. It is easier to understand that someone is busy and doing good work when you can actually see them. However, I don’t think it is the same for all of us. Our programmers could generally have EVO on and running most of the time, but they tend to only use it during our weekly meeting. I also turn the big monitor off quite a lot, even though I leave the camera running, as a way not to be distracted by what is going on in my peripheral vision when I am working on something.

But overall, adding a permanent video feed to our virtual office has been a positive step forward. If you have colleagues you work with a lot, but would like to see more often, I would recommend that you experiment with it. The only thing you need is a computer with a webcam (any recent Mac has one, and you can buy one for a PC for 30 Euro). EVO is free, for now.



Filed under: ITC technology

Heroes – Joop van Lenteren

[blip.tv ?posts_id=1819811&dest=-1]

My brother-in-law, Joop van Lenteren, has over the last twenty years taught me many things about sustainable living, particularly with regard to agriculture. He served as a mentor when I needed to move away from the rat race and helped me find a place in a sustainable bigger picture.

Besides being a great person and my brother-in-law, Joop is a Professor of Entomology at Wageningen University and is one of the world’s most respected entomologists in the area of biological control of pests and diseases. He has won multiple science prizes, including the AkzoNobel Science Award (1982), the Koninklijke/Shell Prijs (2005) and the Rank Prize for Nutrition (2006).

Some of his most significant scientific papers are:

Lenteren, J.C. van, 1990. Insects, Man and the Environment: Who will survive? In “Environmental Concerns: An Inter-disciplinary Exercise”, J.Aa. Hansen (ed.), Elsevier, London: 191-210.

Lenteren, J.C. van, 2000. A greenhouse without pesticides: fact or fantasy? Crop Protection 19: 375-384.

Lenteren, J.C. van, Bale, J., Bigler, F, Hokkanen, H.M.T., Loomans, A.J.M., 2006. Assessing risks of releasing exotic biological control agents of arthropod pests. Annual Review of Entomology, 51: 609-634. + supplemental material.

Filed under: Food production, Heroes

Reflections on #SSWC, 11 days later

Some reflections on the un-conference Swedish Social Web Camp, #SSWC.

[blip.tv ?posts_id=4069641&dest=-1]

The first question was: “What was #SSWC for you?” Anna Hass (@glimra) starts, followed by Roman Pixell (@d0pp13r) and finally Jennifer Bark (@thejennie). We were sitting together with a bunch of others, having a bit of a post-#SSWC chat.

SSWC-Jesus

Incidentally, I have to say that my favourite picture from #SSWC 2010 is “Jesus on the mountain”, or “SSWC Jesus”: Roman Pixell (@d0pp13r) in a kimono, preaching wise words about how to make your web service stand out from the crowd. The picture contains everything you need to know about SSWC 2010: it was relaxed enough that you could hold a session in a kimono without being seen as a freak, the oak tree, the summer, the participation of everyone present, and nature.

Filed under: SSWC

About Bjelkeman

thomas@bjelkeman.com

Co-founder/director: Akvo Foundation

+46-8-626 7609

More about Thomas Bjelkeman-Pettersson

@bjelkeman on Twitter
