Evaluation of the user interface of the BLAST annotation tool
Linköping University, Department of Computer and Information Science; The Institute of Technology.
2012 (English). Independent thesis, Advanced level (degree of Master, Two Years), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

In general, annotations are notes made on a text while reading, for example by highlighting or underlining. In a machine translation system, such markings of text serve as error annotations, which provide information for classifying translation errors.

The main focus of this thesis is the evaluation of the graphical user interface of BLAST, an annotation tool that can be used to perform human error analysis of output from any machine translation system, for any language. The primary intended use of BLAST is the annotation of translation errors.

The evaluation of BLAST focuses on identifying usability issues, assessing understandability, and proposing a redesign that addresses the issues found. By letting the subjects explore BLAST freely, the usage and performance of the tool were observed and subsequently analysed.

Five participants took part in the usability study and were asked to perform user tasks designed to evaluate the usability of the tool. Data were collected on the basis of these tasks, using interviews, observation and a questionnaire, and were analysed with both quantitative and qualitative approaches.

The participants' technical knowledge and their interest in experimenting with a new interface were found to affect the evaluation of the tool. The problems individuals faced during the evaluation were identified, and solutions to overcome them were derived.

Finally, a redesign proposal for BLAST was developed to overcome these problems. I proposed a few designs addressing the issues found in the interface. The designs can be adapted to the existing system or implemented from scratch, and a follow-up evaluation study of the proposed interface designs is also possible.

Place, publisher, year, edition, pages
2012, 80 p.
Keyword [en]
Usability, Evaluation, Annotation, Error analysis, Machine translation
National Category
Computer Science
URN: urn:nbn:se:liu:diva-82339
ISRN: LIU-IDA/LITH-EX-A--12/039--SE
OAI: diva2:558051
Subject / course
Master's programme in Computer Science
Available from: 2012-10-01. Created: 2012-10-01. Last updated: 2012-10-01. Bibliographically approved.

Open Access in DiVA

fulltext (2137 kB)
File information
File name: FULLTEXT01.pdf. File size: 2137 kB. Checksum: SHA-512.
Type: fulltext. Mimetype: application/pdf.

Author
Kondapalli, Vamshi Prakash