Next, OD may conflict with security, the principal goal of the domain in which the government research institute in our case operates. Identity disclosure from survey or administrative data might be used by private or public groups to target or harm individuals, population subgroups, or business enterprises (Gutmann et al. 2008). The privacy of civilians therefore needs to be thoroughly protected. Civilians expect public organizations to follow rules and procedures carefully in order to protect them and their privacy, and harm to civilians caused by incorrectly followed rules and procedures may lead to social unrest. Measures to ensure that rules and procedures are followed correctly should therefore be taken into account when developing infrastructures for data sharing in the public domain (Braak et al. 2012).
Thirdly, OD may conflict with transparency via the intermediary element of information overload. In the literature, information overload is said to occur when the information received becomes a hindrance rather than a help, even though it is potentially useful (Bawden 1999). Information overload is related to the quantity and diversity of the information available (Bawden 2009). We therefore argue that, as governmental organizations possess large volumes of data about many subjects, opening up this data may cause information overload.
Finally, OD may have negative effects on trust via the intermediary elements of validity and reliability of results in cases where the data are re‐used. This concerns re‐use of the data as provided, but also re‐use in which the data are extended with other (open) data sources. We argue that as data are opened, governmental control over reliability and validity decreases, because a proper interpretation can no longer be guaranteed. Third parties may use the opened data in ways that weaken these elements. Data from administrative databases might, for instance, be misinterpreted and misused, with the stigmatization of groups as a consequence (Kalidien et al. 2010).
3.3 Precommitment
We have found several contradictions between the OD values. Consequently, to succeed, OD policy has to reconcile these values. We argue that constraints imposed on access to data are important for trust, as they help determine how far one party needs to trust the other and vice versa. We argue that precommitment is necessary to bridge the contradictions in OD.
Precommitment is a restriction of one's choices (Elster 2000; Kurth‐Nelson and Redish 2012); it implies constraint. Individuals might benefit from having specific options unavailable, available only after a delay, or available only at greater cost. Precommitment may be aimed at overcoming impulsivity (e.g. some gambling machines require the gambler to pre‐set a limit on his or her expenditure, after which the machine deactivates) (Kurth‐Nelson and Redish 2012). Precommitment has also been theorized as a device whereby we can impose some restraint on ourselves and thus restrict the extent to which others have to worry about our trustworthiness (Gambetta et al. 2000).
We conceptualize precommitment as a policy instrument whereby an organization – in this case the one responsible for OD – imposes some restraint on its own policy in order to restrict the extent to which values may conflict and stakeholders have to worry about the trustworthiness of that policy. Several restraining options for opening data are possible to limit conflicts between OD values. In the first place, privacy‐sensitive data might be irrevocably deleted: datasets may be completely anonymized before they are archived, which limits future possibilities to replicate research or to link the datasets to other data to create new datasets. Secondly, data that include privacy‐sensitive attributes might be opened for specific goals or target groups only. Some data (e.g. registry databases of police and prosecution) might be distributed only for specific scientific purposes and to scientific institutes only, in compliance with the privacy laws and regulations in force. Thirdly, before data from research are opened to the public, all privacy‐sensitive data need to be thoroughly removed; as a result, the opened datasets contain only limited information and can be analyzed only in a (very) restricted way. Finally, data can be made accessible indirectly by providing exclusively highly aggregated data, generated by data experts.
The following case illustrates how precommitment is implemented in the open data policy of a government research institute operating in the field of security and justice.