  • 1. Bjurulf, S.
    et al.
    Vedung, Evert
    Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Institute for Housing and Urban Research.
    Larsson, C. G. G.
    A triangulation approach to impact evaluation, 2013. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 19, no. 1, p. 56-73. Article in journal (Refereed)
    Abstract [en]

    This article presents a viable method to overcome the challenge of producing reliable cause-effect findings in impact evaluation. The Measuring Cluster Effects through Triangulation method (MCET) involves methodological triangulation. Three designs - shadow controls, generic controls, and process tracing - are combined to shed light on causality. When these three approaches are triangulated, cause-effect findings will be more reliable. The MCET combination is a feasible alternative when randomized controlled trials and matched controls are impossible or impracticable. It is also an alternative to using a single non-experimental design, particularly in situations where expenditure is great and the causality issue is pressing. In this article, the MCET approach is illustrated by information drawn from a set of evaluations performed on the activities of the Compare Foundation, a cluster organization in Sweden in the Information and Communication Technology sector. Regionally based in Karlstad, County Värmland, and founded in 2000, the Compare cluster organization has adapted the MCET to its own activities.

  • 2.
    Bjurulf, Staffan
    et al.
    Karlstad University, Faculty of Arts and Social Sciences (starting 2013), Service Research Center. Region Värmland, SE-65115 Karlstad, Sweden.
    Vedung, Evert
    Uppsala University, Uppsala, Sweden.
    Larsson, C. G.
    A triangulation approach to impact evaluation, 2013. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 19, no. 1, p. 56-73. Article in journal (Refereed)
    Abstract [en]

    This article presents a viable method to overcome the challenge of producing reliable cause-effect findings in impact evaluation. The Measuring Cluster Effects through Triangulation method (MCET) involves methodological triangulation. Three designs - shadow controls, generic controls, and process tracing - are combined to shed light on causality. When these three approaches are triangulated, cause-effect findings will be more reliable. The MCET combination is a feasible alternative when randomized controlled trials and matched controls are impossible or impracticable. It is also an alternative to using a single non-experimental design, particularly in situations where expenditure is great and the causality issue is pressing. In this article, the MCET approach is illustrated by information drawn from a set of evaluations performed on the activities of the Compare Foundation, a cluster organization in Sweden in the Information and Communication Technology sector. Regionally based in Karlstad, County Värmland, and founded in 2000, the Compare cluster organization has adapted the MCET to its own activities.

  • 3.
    Carlsson, Lars
    Luleå University of Technology, Department of Business Administration, Technology and Social Sciences, Social Sciences.
    Non-hierarchical evaluation of policy, 2000. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 6, no. 2, p. 201-216. Article in journal (Refereed)
    Abstract [en]

    An important task for policy evaluation is to develop methods that are based on the fact that political power is fragmented and that every policy area is complex. This article demonstrates, using an empirical example, how different strands of the policymaking process are related to different logics of evaluation. Also discussed is how these differences may result in quite opposite conclusions about the possible failure or success of single programmes. However, it is concluded that policy research does not have to abandon the idea of rationality and adopt a more postmodern or hermeneutic line of analysis. Policy evaluation is still, it is argued, a matter of finding relevant units of analysis, and in contemporary society these units are networks rather than political-administrative entities. Thus, in order to be able to scrutinize and understand such processes of policy creation, policy evaluation must adopt a non-hierarchical attitude and this requires a bottom-up methodology.

  • 4. Castro, Maria Pia
    et al.
    Fragapane, Stefania
    Rinaldi, Francesco Mazzeo
    KTH. University of Catania, Italy.
    Professionalization and evaluation: A European analysis in the digital era, 2016. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 22, no. 4, p. 489-507. Article in journal (Refereed)
    Abstract [en]

    It is expected that the number of evaluators will continue to grow in the near future. However, the heterogeneity of different national contexts makes the consolidation of a consistent ‘jurisdiction’ for the professional evaluator rather problematic. This article contributes to the debate on the professionalization of evaluators by looking at practices attributed, competences and skills required by employers, and the main topics addressed by the community of evaluators. The authors draw on various sources - ISCO08 (International Standard Classification of Occupation); ESCO (European Skills, Competences, Qualifications and Occupations); job offers posted on the EES (European Evaluation Society) website; EES LinkedIn group - to argue that the practice of evaluation has achieved a supranational dimension, with potential consequences both on evaluators' educational profile and on the ways in which evaluations are commissioned and conducted.

  • 5.
    Denvall, Verner
    et al.
    Lund University.
    Linde, Stig
    Lund University.
    Knocking on heaven's door: The evaluation community goes to church, 2013. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 19, no. 4, p. 431-441. Article in journal (Refereed)
    Abstract [en]

    Ideas and concepts about evaluation travel around the globe. Studies of how evaluation models are disseminated, diffused and implemented are important. In this article, we examine an organization with a history of traditions and legitimacy and a successful audit of its own and how it responds to modern concepts of administration where evaluation plays an important role. Based on an analytical framework from organizational theory, we show how an evaluation model has been either adopted, rejected or transformed depending on the local context and consider why central policies have had limited success in its implementation. This article should contribute to a better understanding of the transformation that evaluation undergoes in the journey between and within organizations.

  • 6.
    Fjellström, Mona
    Umeå University, Faculty of Social Sciences, Department of Education.
    A Learner-Focused Evaluation Strategy: Developing Medical Education through a Deliberative Dialogue with Stakeholders, 2008. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 14, no. 1, p. 91-106. Article in journal (Refereed)
  • 7.
    Hanberger, Anders
    Umeå University, Faculty of Social Sciences, Umeå Centre for Evaluation Research (UCER).
    Evaluation of and for democracy, 2006. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 12, no. 1, p. 17-37. Article in journal (Refereed)
    Abstract [en]

    There are many options for elaborating democratic evaluations. This article discusses evaluation of and for democracy, and in particular three broad democratic evaluation orientations: elitist democratic evaluation (EDE), participatory democratic evaluation (PDE) and discursive democratic evaluation (DDE). The archetypes differ regarding, for example, evaluation focus, inclusion of stakeholders, dialogue and the role of the evaluator. The three orientations promote certain democratic values and are linked to the elitist, participatory or discursive notions of democracy respectively. It is argued that there is a need to become more conscious of how evaluations not labelled democratic can influence democracy and what responsibility democratic evaluators have. If commissioners and evaluators become more aware of the different democratic orientations evaluations may have, they will be better able to decide which evaluation to commission and undertake.

  • 8.
    Hanberger, Anders
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Umeå Centre for Evaluation Research (UCER).
    Multicultural awareness in evaluation: dilemmas and challenges, 2010. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 16, no. 2, p. 177-191. Article in journal (Refereed)
    Abstract [en]

    The purpose of this article is to discuss what is meant by multicultural competence in evaluation and how policies and programmes aiming at multicultural awareness and ‘validity’ can be evaluated. The article discusses three main ways of understanding multiculturalism and how multicultural competence in evaluation can be defined. It also develops evaluation criteria that can be used for assessing the multicultural implications of policies and programmes. The article suggests that a multiculturally competent evaluator should be well informed about minority and majority norms and also familiar with different models of multiculturalism. The multiculturally aware evaluator employs an appreciative approach to traditional cultures that is consistent with human rights and international law. A multiculturally relevant evaluation should stimulate a discussion that facilitates inter-cultural understanding and multicultural awareness. Developing multicultural awareness in evaluation can be seen as a way of developing democratic evaluation.

  • 9.
    Hanberger, Anders
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Umeå Centre for Evaluation Research (UCER).
    Rethinking democratic evaluation for a polarised and mediatised society, 2018. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 24, no. 4, p. 382-399. Article in journal (Refereed)
    Abstract [en]

    This article discusses how democratic evaluation can manage threats to democracy, democratic renewal, and the mediatisation of public policy and governance. It considers the readiness of five democratic evaluation orientations to deal with current threats and discusses how to develop them. It demonstrates that democratic evaluation is poorly prepared to manage current threats to democracy or the mediatisation of public policy. Progressive evaluation is the only approach offering some new keys to addressing certain current threats and challenges. The other orientations have some capacity to manage threats to democracy and support democratic renewal, but need further development. The article suggests that democratic evaluation could be a constructive tool for maintaining and developing democracy in an increasingly polarised and mediatised society if evaluators gain knowledge of threats to democracy, democratic transition, and democratic renewal and, informed by mediatisation and democracy research, develop the necessary awareness and competence to deal with these challenges.

  • 10.
    Hanberger, Anders
    Umeå University, Faculty of Social Sciences, Department of applied educational science.
    The real functions of evaluation and response systems, 2011. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 17, no. 4, p. 327-349. Article in journal (Refereed)
    Abstract [en]

    This article is intended to contribute to the understanding of evaluation use in the context of policy making and governance. It does this, first, by developing a framework that sets out the prerequisites of six possible functions of an evaluation ‘management response system’; and second, by analysing three aid organizations’ management response systems in relation to this framework. The prerequisites of the different functions are theoretically derived. The analysis finds that response systems have contributed to organizational legitimacy and achieved many of their intended functions, but only to an extent. Two factors – the system design, and top managers’ support for the system – were found to be critical for how these systems worked. The article also discusses whether a management response system meets the needs of public organizations and stakeholders operating in multi-actor policy making.

  • 11.
    Hanberger, Anders
    Umeå University, Faculty of Social Sciences, Umeå Centre for Evaluation Research (UCER).
    What is the policy problem? Methodological challenges in policy evaluation, 2001. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 7, no. 1, p. 45-62. Article in journal (Refereed)
    Abstract [en]

    Because nobody knows, when a policy process starts, what line of action will eventually be implemented, policy evaluation has to examine the content of different policy components continuously. In order to understand and explain public policy, different stakeholders’ perceptions of the policy problem need to be scrutinized. A policy evaluation should also facilitate the interpretation of policy in a broader context. What values and order does the policy or programme promote? Using an open evaluation framework and a mix of criteria can facilitate a broader interpretation of the policy process. In this article, problems in undertaking policy evaluation are discussed in relation to a Swedish medical informatics programme.

  • 12.
    Hanberger, Anders
    et al.
    Umeå University, Faculty of Social Sciences, Umeå Centre for Evaluation Research (UCER).
    Schild, Ingrid
    Umeå University, Faculty of Social Sciences, Department of Sociology.
    Strategies to evaluate a university–industry knowledge-exchange programme, 2004. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 10, no. 4, p. 475-492. Article in journal (Refereed)
    Abstract [en]

    This article discusses different approaches to evaluating a knowledge-exchange programme designed to foster closer university–industry interaction. It shows how the same policy or programme can be understood and thus evaluated in a number of ways. First, a particular knowledge-exchange programme evaluated by the authors is described. The article then outlines four different methodologies that can be used to evaluate the programme, and assesses it from the perspective of each, outlining the respective strengths and weaknesses of the approaches. The first two evaluation approaches, programme theory evaluation and outcome analysis, tend to be applied in ways that privilege the policy/programme makers’ worldview, and in this sense may be considered ‘management-oriented’ approaches; the second two approaches, policy discourse analysis and qualitative network analysis, are often applied in ways that incorporate a critical stance to this worldview, and may in this sense be considered ‘non-management-oriented’ approaches. The validity of a policy or programme evaluation can be enhanced by adopting a multi-methodological design incorporating both types of approach. Stakeholders are more likely to learn from a programme/policy evaluation and to be receptive to its conclusions if their differing perspectives and success criteria are incorporated into the evaluation.

  • 13.
    Hertting, Nils
    et al.
    Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Institute for Housing and Urban Research. Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Department of Government.
    Vedung, Evert
    Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Institute for Housing and Urban Research.
    Purposes and criteria in network governance evaluation: How far does standard evaluation vocabulary take us? 2012. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 18, no. 1, p. 25-44. Article in journal (Refereed)
    Abstract [en]

    Evaluation and network governance are both among the top-10 trendy concepts in public policy. But how are they related? In the present article, we ask how public sector interventions guided by a network governance doctrine are to be evaluated. If evaluation means systematic judgment of organization, content, administration, outputs and effects in public policy, then evaluators need concepts and analytical tools to assess these features and communicate their analyses. In the literature, interest in network modes of governance often goes together with a call for a renewed vocabulary for evaluation and policy analysis. In the article, we do not take this to be a fact. Instead we turn it into a question: How relevant and productive are established concepts and tools of evaluation theory for evaluating network governance? More specifically, we address the issues of purposes and merit criteria in evaluation of interventions fashioned according to the network governance doctrine. Though it takes some elaboration, our overall conclusion is that at least some standard evaluation concepts and approaches are still productive in delineating, analysing and prescribing how network governance can be evaluated. There are crucial accountability issues to raise, the goal-achievement criterion is not irrelevant and the meaning of stakeholder evaluation is elucidated when confronted with the ideas of the network governance doctrine.

  • 14.
    Karlsson, Ove
    Mälardalen University, Department of Social Sciences.
    Critical dialogue: Its value and meaning, 2001. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 7, no. 2, p. 211-227. Article in journal (Refereed)
    Abstract [en]

    Today dialogue is a frequently used idea in the discourse of evaluation. Dialogue stands for an ambition to involve different stakeholders in an open and power-free exchange of opinions and ideas about what is evaluated. The aim of this article is to give an example of how to manage dialogue in practice. An evaluation case study is used to illustrate how the evaluators manage a dialogue in different phases of the evaluation process. To handle the sometimes difficult situation with many different views on the subject, the evaluators developed different, sometimes new and innovative, methods to make the exchange of ideas possible. The article shows that dialogue in evaluation needs to be adjusted to different situations and needs among the participants.

  • 15.
    Karlsson, Per-Åke
    et al.
    University of Borås, School of Health Science.
    Beijer, Elisabeth
    Eriksson, Bengt
    Leissner, Tom
    Evaluation Workshops for Capacity Building in Welfare Work: Some Swedish Examples, 2008. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 14, no. 4, p. 477-491. Article in journal (Refereed)
    Abstract [en]

    Ever increasing demands are being made on welfare organizations to display efficiency. Evaluation workshops constitute a form of learning for the purposes of building up competence to conduct evaluations within welfare organizations, with the support of research and development units. In workshops of this kind, welfare work professionals meet in order to conduct evaluations together with researchers/professional evaluators. This article presents experiences from 10 such evaluation workshops conducted in western Sweden. The workshops were perceived very positively by the participants. While the evaluations are being conducted, the participants also develop a more general competence in this field. The evaluations conducted at the workshops are primarily internal, but with external support, with all the limitations this involves in relation to the possibilities for critical scrutiny. Evaluation workshops have a beneficial effect on the learning of evaluation methods by directly combining learning and conducting evaluation. The workshops may also serve to build capacity in the organizations for evaluative work.

  • 16.
    Khakee, Abdul
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Umeå Centre for Evaluation Research (UCER).
    The emerging gap between evaluation research and practice, 2003. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 9, no. 3, p. 340-352. Article in journal (Refereed)
    Abstract [en]

    While evaluation practice is still disposed towards rational quantitative methods, evaluation research has increasingly utilized qualitative dialogical methods. In this article, the increasing gap between practice and research is examined by analysing how evaluation research has evolved from within three perspectives: a policy programme perspective, a welfare economics perspective and a planning theory perspective. The article also discusses the implications of the emerging gap between evaluation research and practice.

  • 17.
    Larsson, Magnus
    et al.
    Umeå University, Faculty of Social Sciences, Department of Sociology. Umeå University, Faculty of Social Sciences, Department of applied educational science, Umeå Centre for Evaluation Research (UCER).
    Hanberger, Anders
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Umeå Centre for Evaluation Research (UCER).
    Evaluation in management by objectives: a critical analysis of Sweden’s national environmental quality objectives system, 2016. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 22, no. 2, p. 190-208. Article in journal (Refereed)
    Abstract [en]

    This article investigates what can be achieved by one particular management by objectives system – i.e. the Swedish National Environmental Quality Objectives system and its evaluation function. A critical programme theory analysis is first developed to reconstruct the programme theory of the National Environmental Quality Objectives system. Next, the robustness of the programme theory is analysed in terms of internal consistency, theoretical support and empirical support. The results indicate that, while some assumptions underlying the National Environmental Quality Objectives system are valid, the National Environmental Quality Objectives’ programme theory has low validity in several respects. The evaluative knowledge the system produces is only partly relevant to or useable by local actors and industry. While the state of the environment is observed and measured in the monitoring and evaluation reports, the direct effects of environmental policy and work are not evaluated, which is a main weakness of the National Environmental Quality Objectives system. It is unlikely that the current evaluation function can effectively support achievement of the National Environmental Quality Objectives. The article suggests that evaluations in support of network governance are more likely to support National Environmental Quality Objectives achievement and sustainable development.

  • 18.
    Nordesjö, Kettil
    Linnaeus University, Faculty of Social Sciences, Department of Social Work.
    Made in Sweden: The translation of a European evaluation approach, 2019. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 25, no. 2, p. 189-206. Article in journal (Refereed)
    Abstract [en]

    To understand how evaluation approaches change between contexts, they need to be studied in relation to their social, cultural, organizational and political contexts. The aim of the article is to describe and analyse how the European Union evaluation approach, ongoing evaluation, was translated in Swedish public administration. A case study shows how institutional entrepreneurs promote their evaluation norms of participatory evaluation and attach evaluation to a less dominant governance logic in the Swedish evaluation field. This raises questions about the role of the evaluator, evaluation terminology, and the unclear and weak borders of the evaluation field where evaluation approaches can be launched and translated with relative ease.

  • 19.
    Segerholm, Christina
    et al.
    Mid Sweden University, Faculty of Educational Sciences, Department of Education.
    Åström, Eva
    Governance through Institutionalized Evaluation: Recentralization and Influences at Local Levels in Higher Education in Sweden, 2007. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 13, no. 1, p. 48-67. Article in journal (Refereed)
    Abstract [en]

    Public sectors in Europe and elsewhere are the subject of regular, recurrent and systematic evaluations. Evaluations have become institutionalized. This article examines the influence of institutionalized evaluation in higher education in Sweden. In this decentralized education system several kinds of effects of evaluation are detected locally, i.e. at whole-university and at department levels. Through an evaluation process characterized by self-evaluation, external reviews and public reports produced by the Swedish National Agency for Higher Education, all universities are examined. Governance operates by making universities visible, and thereby promoting comparison and competition, control and self-control. Centrally defined criteria are implemented in the process, via direct contact between the National Agency and university departments, leading to the recentralization of power. Strategies to deal with these evaluations at departmental level are developed and unintended influences, like learning resistance strategies, are highlighted.

  • 20.
    Suárez-Herrera, José Carlos
    et al.
    Université de Montréal.
    Springett, Jane
    Liverpool John Moores University.
    Kagan, Carolyn
    Manchester Metropolitan University.
    Critical connections between participatory evaluation and organizational change dynamics, 2009. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 15, no. 3, p. 321-342. Article in journal (Refereed)
    Abstract [en]

    The current debate around the emergence of participatory approaches in evaluation practice suggests that participatory evaluation may be considered an organizational learning praxis, one which facilitates the development of a holistic process of intentional change. Through critical reflection on how participatory evaluation has been conceptualized, this article offers an overview of some of the contextual challenges encountered when using participatory evaluation to enable the creation of learning environments. Given the pluralistic nature of modern organizations and some contextual constraints, evaluators appear to have largely developed a more instrumental type of learning, which may, paradoxically, result in a significant source of resistance to intentional change. This article proposes a process of capacity building for evaluative research (CBER). This process offers a collaborative way of overcoming unforeseen resistance to intentional change by overcoming the challenges found in the relationship between participatory evaluation and organizational learning. The article concludes by suggesting some epistemological and organizational issues that evaluators should take into account when enabling the implementation of a process of CBER in pluralistic organizations.

  • 21.
    Vedung, Evert
    Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Institute for Housing and Urban Research.
    Four Waves of Evaluation, 2010. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 16, no. 3, p. 263-277. Article in journal (Refereed)
    Abstract [en]

    This article investigates the dissemination of evaluation as it appears from a Swedish and to a lesser extent an Atlantic vantage point since 1960. Four waves have deposited sediments, which form present-day evaluative activities. The scientific wave entailed that academics should test, through two-group experimentation, appropriate means to reach externally set, admittedly subjective, goals. Public decision-makers were then supposed to roll out the most effective means. Faith in scientific evaluation eroded in the early 1970s. It has since been argued that evaluation should be participatory and non-experimental, with information being elicited from users, operators, managers and other stakeholders through discussions. In this way, the dialogue-oriented wave entered the scene. Then the neo-liberal wave from around 1980 pushed for market orientation. Deregulation, privatization, contracting-out, efficiency and customer influence became key phrases. Evaluation as accountability, value for money and customer satisfaction was recommended. Under the slogan ‘What matters is what works’ the evidence-based wave implies a renaissance for scientific experimentation.

  • 22.
    Vedung, Evert
    Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Institute for Housing and Urban Research.
    Review of The Handbook of Environmental Policy Evaluation by Ann Crabbé and Pieter Leroy (London and Sterling, VA: Earthscan, 2008), 2009. In: Evaluation, ISSN 1356-3890, E-ISSN 1461-7153, Vol. 15, no. 4, p. 483-486. Article, book review (Other academic)