Application of Crowdsourcing in User Experience collection – a case study of Malayalam mobile applications

Malathi Sivasankara Pillai (Department of Computer Applications, Cochin University of Science and Technology, Kochi, India)
Kannan Balakrishnan (Department of Computer Applications, Cochin University of Science and Technology, Kochi, India)

Rajagiri Management Journal

ISSN: 0972-9968

Article publication date: 29 June 2023

Issue publication date: 2 January 2024

Abstract

Purpose

This paper aims to test the following hypotheses: (1) User Experience collection for mobile applications can be done using the Crowdsourcing mechanism; (2) User Experience collection for mobile applications is influenced by the mindset of Crowd members, culture/ethnicity/social background, ease of interface use and rewards, among other factors.

Design/methodology/approach

The authors first conducted a literature review to determine whether Crowdsourcing is applicable and has been used to solve problems in Software Engineering. This helped narrow the application of Crowdsourcing down to Requirements Engineering, specifically Usability (User Experience) collection. As the next step, User Experience collection was done for two Malayalam-language mobile applications, AarogyaSetu and BevQ (Study I). Incorporating the findings of Study I, another study using AarogyaSetu and Manglish was launched as Study II. The results of both studies were consolidated and analyzed. Significant concerns relating to the expectations of Crowd members in User Experience collection were unraveled, and the purpose of the study was accomplished.

Findings

(1) Crowdsourcing is, and can be, used in Software Engineering activities. (2) Crowd members have expectations (motivating factors) of the User Interface and other elements that enable them to be effective contributors. (3) An individual’s environment and mindset (character) influence whether he or she becomes a contributor in Crowdsourcing. (4) The culture and social practices of a region strongly affect an individual’s decision to participate in a crowd.

Originality/value

This is entirely the authors’ own work. The value of this research is twofold. One, Crowdsourcing is endorsed as significant for Software Engineering tasks, especially User Experience collection for mobile applications. Two, Crowd service requesters can take care in designing the questionnaire for Crowdsourcing: they have to be aware of, and prepared to meet, the expectations of the Crowd, which can ensure the active participation of potential contributors. Future researchers can base work with similar purposes on the results presented here.

Citation

Sivasankara Pillai, M. and Balakrishnan, K. (2024), "Application of Crowdsourcing in User Experience collection – a case study of Malayalam mobile applications", Rajagiri Management Journal, Vol. 18 No. 1, pp. 20-42. https://doi.org/10.1108/RAMJ-10-2022-0153

Publisher

Emerald Publishing Limited

Copyright © 2023, Malathi Sivasankara Pillai and Kannan Balakrishnan

License

Published in Rajagiri Management Journal. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

Published work defines Crowdsourcing as an open-call, free-to-choose mechanism that calls on individual contributors who are skillful, experienced and willing to contribute to a particular piece of work or service (Hosseini and Mahmoud, 2014; Estellés-Arolas et al., 2015; Kietzmann and Jan, 2017). The mechanism also involves controlling and rewarding the participants (Chandler and Mueller, 2013; Goh et al., 2017; Cappa et al., 2019). The group of participants is called a “crowd” and individual participants, “crowd members”. Crowdsourcing is a suitable option when organizations cannot find people with suitable skill sets, or lack other resources such as software, hardware, appropriate tools or people experienced with the kind of work in question. In most such cases, gathering the resources in-house would be far more expensive, would take time and would not be a worthy solution to the problem. We made a thorough investigation of the literature and found that although much research has been done on applying Crowdsourcing in Software Engineering (Asiegbu Baldwin et al., 2017; Stol et al., 2017; LaToza and Van Der Hoek, 2015; Khan et al., 2021), especially user experience collection, there was no such work on the application of Crowdsourcing in user experience collection for any Malayalam mobile application, and this gap led to the work consolidated in this paper. Malayalam is an ancient Dravidian language, native to the state of Kerala, India (NIC for Government of Kerala, 2021b).

2. Methodology

The problem was viewed from a Software Engineering perspective. Requirements Engineering, the first phase of the software lifecycle model, was considered.

  • Step 1: An exhaustive literature study was conducted on the application of Crowdsourcing in Software Engineering. The purpose of this step was to narrow the focus down to a specific area and specific application types.

  • Step 2: Based on the results of Step 1, the focus was shifted to a more specific domain in Software Engineering. A detailed literature review was done on the focus area and application type thus revealed. From its results, a clearer view of the need for an empirical study using Crowdsourcing in User Experience collection was obtained.

  • Step 3: Crowdsourced user experience collection for Malayalam mobile applications was done in two steps. A Crowdsourcing questionnaire was prepared, crowd members were identified, the questionnaire was distributed and feedback was collected from Crowd members. The applications considered were the AarogyaSetu and BevQ Malayalam mobile applications. Result analysis was done.

  • Step 4: A Crowdsourcing questionnaire was prepared, crowd members were identified, the questionnaire was distributed and feedback was collected from Crowd members for the AarogyaSetu and Manglish applications. Result analysis was done.

  • Step 5: Consolidation of User Experience feedback was done, based on the results of both studies.

  • Step 6: Analysis of consolidated feedback was done and useful insights were obtained.

3. Methodology implementation

3.1 Literature review on the application of crowdsourcing in software engineering

A literature study was conducted on 30 publications on the topic. A tabular consolidation of the 12 most relevant papers on Crowdsourcing for Requirements-related aspects of Software Projects is given in Tables 1–3.

After collecting the details of the work and challenges identified in applying Crowdsourcing to Software Engineering, the word-cloud service of MonkeyLearn (MonkeyLearn Team, 2021) was used to find the most frequently addressed work as well as the most frequently cited challenges in the relevant published work in this area. The word with the maximum occurrence and significance was considered the most pivotal. The MonkeyLearn plot of concerns addressed in the published literature on applying Crowdsourcing to Software Engineering is shown in Plate 1.
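
MonkeyLearn is a hosted service, but the underlying frequency analysis can be approximated offline. The following minimal Python sketch is our illustration only (the sample entries and stopword list are placeholders, not the study’s data); the term with the highest count corresponds to the largest word in the cloud.

    # Minimal sketch: approximate the word-cloud frequency analysis offline.
    # The phrases below are illustrative placeholders, not the study's actual data.
    from collections import Counter
    import re

    STOPWORDS = {"the", "of", "for", "and", "in", "to", "a", "on", "using"}

    def term_frequencies(entries):
        """Count non-stopword terms across a list of free-text entries."""
        counts = Counter()
        for entry in entries:
            for word in re.findall(r"[a-z]+", entry.lower()):
                if word not in STOPWORDS:
                    counts[word] += 1
        return counts

    aspects = [
        "Crowdsourcing for Requirements Engineering",
        "Requirements elicitation using user feedback",
        "Crowdsourcing-based requirements elicitation",
    ]

    # The most frequent term plays the role of the largest word in the cloud.
    for term, n in term_frequencies(aspects).most_common(5):
        print(term, n)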

From the plot of concerns addressed in the published literature on applying Crowdsourcing to Software Engineering (Plate 1), it is evident that the majority of the studies were on applying Crowdsourcing to Software Requirements/Requirements Engineering.

A MonkeyLearn plot of the challenges in this context was generated next, to find existing problems that need to be addressed when implementing Crowdsourcing in Software Engineering. The plot is shown in Plate 2 below.

The plot of challenges uncovered during the various studies indicates that the challenges the majority of researchers faced in using Crowdsourcing with Software Engineering were collecting feedback from the Crowd and aligning/sequencing the tasks, i.e. the process of Crowdsourcing.

With the vision obtained from the literature review explained above, the focus was narrowed down to Software Requirements Engineering. In the second step, a detailed literature review was conducted on the application of Crowdsourcing in User Experience collection for mobile applications. A direct search for such work with Malayalam mobile applications was also done.

3.2 Literature review on the application of crowdsourcing in software requirements engineering/management

Step 2 was a study of the literature on state-of-the-art work applying Crowdsourcing to Software Requirements Engineering/Management. The focus was narrowed down to software requirements based on the results of the literature review detailed above. The scenario considered was that of mobile applications, since the majority of such work was done with mobile applications. A brief look was also taken at work in this area concerning Malayalam mobile applications. The major work done in this area and its relevant aspects are consolidated in Tables 4–6 below:

The column “Aspects covered” in Tables 1–6 above was plotted using MonkeyLearn. This was done to find the most significant, most frequently occurring term in the set, which represents the concern addressed by the majority of published work in this area.

Plate 3 gives a clear indication that the focus was first on the user and then on a prototype. In the literature, language was not a factor anywhere, because the majority of the work we came across was in English; the very few works in other languages were negligible and made no remarkable contributions. The consolidation therefore contained work in the English language only. No trace of Malayalam could be found up to 2021.

Next, a MonkeyLearn plot was generated for the “Challenges Mentioned” column of Tables 4–6. This column contained the challenges listed in existing works on the use of Crowdsourcing for User Experience collection of mobile applications. Plate 4 depicts the result.

Plate 4 plots these challenges. The majority’s concerns related to the crowd members themselves and to the interface given to Crowd members for working with the request and contributing.

3.3 The outcome of the initial literature studies

Two sets of literature studies were conducted: one on Crowdsourcing for requirements-related aspects of Software Engineering, and the other on the use of Crowdsourcing in User Experience collection for mobile apps. The majority of the work across the software lifecycle phases focused on requirements and testing. In the process, we came across the use of Crowdsourcing for many different aspects of a software project, yet we never came across such attempts made on mobile applications in the Malayalam language. From the two literature reviews and the result analysis that followed, it became clear that a qualitative case study on this would be highly useful and necessary, and would aid and guide future research in this area.

With the insights from the literature explorations, user experience collection was carried out for three Malayalam mobile applications: Aarogya Setu, BevQ and Manglish (NIC for Government of Kerala, 2021b; NIC for Government of India, 2021a; NIC for Kerala State Beverages Corporation, 2021; Clusterdev, 2021). The first study collected user experience for the two Malayalam mobile apps Aarogya Setu and BevQ; the second collected user experience for Aarogya Setu and Manglish.

3.4 About the applications used for user experience collection

Aarogya Setu is an application available in around twelve Indian languages, including Malayalam; our focus was on the Malayalam Aarogya Setu application. BevQ is a mobile application for token booking in the virtual queue of the Beverages Corporation. The Manglish mobile application is used to key in Malayalam words in English script and get the equivalent in Malayalam-language notation; it is widely used with social media applications. The two studies were conducted by preparing a questionnaire, distributing it to the crowd and collecting feedback from Crowd members (the “crowd”). A comparison of Crowdsourcing feedback was made between the two studies, and conclusions were drawn on the most influential factors to consider in designing applications and interfaces for User Experience collection of Malayalam mobile applications. What the Crowd expects from the service/work requester’s side was also uncovered; this can be considered a factor in attracting the crowd.
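
To make the transliteration idea concrete, here is a toy Python sketch. It is our illustration only, not Clusterdev’s implementation: a real transliterator applies phonetic rules and context rather than a fixed lookup table.

    # Toy illustration of romanized-Malayalam ("Manglish") input.
    # Real transliterators use phonetic rules; this lookup table is only a sketch.
    TRANSLITERATIONS = {
        "amma": "അമ്മ",    # mother
        "nandi": "നന്ദി",  # thank you
    }

    def to_malayalam(romanized: str) -> str:
        """Return the Malayalam-script form of a romanized word, if known."""
        return TRANSLITERATIONS.get(romanized.lower(), romanized)

    print(to_malayalam("Amma"))  # അമ്മ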

Details of the three applications used for the study are given in Table 7 below.

The first study was conducted with the Aarogya Setu and BevQ applications. A questionnaire with mixed question types – yes/no, choice, and think-and-answer (free text) – was distributed using the survey website SurveySparrow. Many people viewed the questionnaire, but only very few attempted it and even fewer completed it.

3.5 Implementing the study using AarogyaSetu and BevQ (study I)

A questionnaire was prepared after studying similar questionnaires used for the purpose (Roy and Ganguli, 2008; Hao et al., 2016; Díaz-Oreiro et al., 2019), and this questionnaire was distributed to the Crowd. The Crowd used here consisted of immediate friends and friend groups who could be possible contributors; we requested them to pass it on to people they knew who could be potential contributors. Table 8 below consolidates the results of this study.

3.5.1 Observations from the study

A questionnaire was prepared with 16 questions, mostly of the think-and-answer type; very few were choice questions. Study I was viewed by 127 people, but only 34 attempted the questions. Of these 34, only 9 (26.47%) completed all the questions. The average time these 9 people took to complete the questionnaire was 13 min and 12 s, which is clearly unappealing for experts and genuine users of the applications. This statistic is presented in Plate 5 below.
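
The percentages quoted above follow directly from the raw counts; as a quick check, the arithmetic can be reproduced in a few lines of Python (a simple illustration using the Study I figures reported in this section):

    # Reproduce the Study I summary statistics from the reported counts.
    viewed, attempted, completed = 127, 34, 9

    print(f"Attempt rate:    {attempted / viewed * 100:.2f}%")    # ~26.77%
    print(f"Completion rate: {completed / attempted * 100:.2f}%") # 26.47%

    # Average completion time: 13 min 12 s per completing respondent.
    avg_seconds = 13 * 60 + 12
    print(f"Average time: {avg_seconds} s")  # 792 s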

Plates 6 and 7 depict graphical representations of different observations from the study applying Crowdsourcing to User Experience collection of Malayalam mobile applications (Study I). Possible reasons are also listed.

3.6 Implementing the study using AarogyaSetu and Manglish (study II)

The methodology adopted was the same as that of Study I, except for a change in strategy: there were no think-and-answer, text-type questions, and the answers to all questions had to be chosen from a list of choices. This decision was based on the analysis of the Study I results. In Study II, more potential contributors viewed the questionnaire than in Study I, and more people attempted and completed it. The consolidation of excerpts from the study is given in Tables 9 and 10 below:
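
The design change can be made concrete with a small data model: every Study II item carries a fixed option list, so a response is always a selection and never typed text. The sketch below is only our illustration of that constraint; the ChoiceQuestion class is hypothetical, and the question shown is adapted from Table 10.

    # Sketch of a choice-only questionnaire item, reflecting Study II's design.
    # The class is hypothetical; the question text is adapted from Table 10.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ChoiceQuestion:
        text: str
        options: tuple  # fixed answer choices; no free-text field at all

        def answer(self, choice_index: int) -> str:
            """Record a response by selecting an option, never by typing."""
            return self.options[choice_index]

    q = ChoiceQuestion(text="Prior experience in using Aarogya Setu?",
                       options=("YES", "NO"))
    print(q.answer(0))  # YES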

Plates 8–12 depict graphical representations of different observations from the study applying Crowdsourcing to User Experience collection of Malayalam mobile applications (Study II). Possible reasons are also listed.

4. Discussion

Crowdsourced User Experience collection for the Malayalam mobile applications AarogyaSetu and BevQ was conducted as the first study. Based on observations from that study, improvements were made to the questions, question types and other interactive items presented to Crowd members for giving feedback. A second study was then conducted by applying Crowdsourcing to User Experience collection for the Malayalam mobile applications AarogyaSetu and Manglish. A comparison of a few serious concerns between the two studies is presented in Table 11 below, together with all observed facts and possible reasons for them.

5. Conclusions

The first two literature reviews drilled down to the significance of, and the necessity for, a case study on the use of Crowdsourcing in usability (User Experience) collection for mobile applications used by people in a specific cultural background. An analysis of the concerns and challenges from the literature reviews uncovered this need. It also indicated that Crowdsourcing can be an effective mechanism for collecting interested, skilled and experienced people’s evaluations of their mobile application usage experience. From the two case studies, many interesting conclusions were drawn. Factors that attract the crowd were absent, other than a sense of obligation for a few. Many conclusions could be drawn about how to design the questionnaire, how the questionnaire or evaluation item could reach the maximum Crowd, the necessity of keeping optimal control over the Crowdsourcing process, etc. The most prominent ones are listed below:

  • Platform – A hierarchical reach mechanism and Internet reach would have given more, and hence better-quality, results.

  • Design – Design the feedback mechanism so that the user interface and choices are unambiguous and distinct. The controls used in the interface should also provide ease of use.

  • Rewards – People work either out of compulsion or motivation. To attract stakeholders, or non-stakeholders external to the system, a rewarding system must be included; identify the most influential factors.

  • Control – With hierarchy levels, control may loosen: in Study I there were many visitors, but very few attempted the questionnaire and even fewer completed it.

  • Schedule – Keep a process in place to make the flow systematic; plan milestones and deliverables.

6. Future work

When a reward is involved, the time given is too short, or anonymity is not maintained and the crowd feels obliged to the requester, there is a greater possibility that the textual expression we receive regarding the User Experience will not be close to the truth. Emotions in a text can indicate the sanctity and dependability of User Experience collected using Crowdsourcing. One of the most important future directions of this research is adding credibility and value to the collected User Experience data by giving weight to assessing its emotional correctness and dependability.
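
As a pointer to how such emotion assessment might begin, the sketch below scores free-text feedback with NLTK’s VADER sentiment analyzer. This is an illustrative starting point only, not the method of this paper: VADER is an English-language lexicon, so Malayalam or Manglish feedback would first need transliteration or a language-specific model, and the sample feedback strings are hypothetical.

    # Illustrative starting point: score the emotional tone of free-text UX
    # feedback. VADER is English-only; Malayalam/Manglish text would need
    # transliteration or a language-specific model first.
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)
    sia = SentimentIntensityAnalyzer()

    feedback = [
        "The app is easy to use and very helpful.",       # hypothetical sample
        "Registration kept failing and wasted my time.",  # hypothetical sample
    ]

    for text in feedback:
        scores = sia.polarity_scores(text)  # neg/neu/pos plus compound in [-1, 1]
        # A compound score that contradicts the respondent's choice-based rating
        # could flag feedback whose dependability deserves a closer look.
        print(f"{scores['compound']:+.2f}  {text}")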

Figures

Plate 1: Plot of concerns addressed in the published literature on applying crowdsourcing to software engineering

Plate 2: Plot of challenges observed in using crowdsourcing in software engineering

Plate 3: Plot of concerns addressed in using crowdsourcing for user experience collection of mobile applications

Plate 4: Plot of challenges faced in using crowdsourcing for user experience collection of mobile applications

Plate 5: Summary of users’ ease/difficulty with the application

Plate 6: SurveySparrow plots of various aspects relating to Study I – continued from the previous figure

Plate 7: SurveySparrow plots of various aspects relating to Study I – continued from the previous figure

Plate 8: SurveySparrow plots of various aspects relating to Study II

Plate 9: SurveySparrow plots of various aspects relating to Study II – continued from the previous figure

Plate 10: SurveySparrow plots of various aspects relating to Study II – continued from the previous figure

Plate 11: SurveySparrow plots of various aspects relating to Study II – continued from the previous figure

Plate 12: SurveySparrow plots of various aspects relating to Study II – continued from the previous figure

Table 1: Crowdsourcing for requirements-related aspects of software projects

1. “A systematic mapping study on crowdsourced requirements engineering using user feedback” (Wang et al., 2019). Authors: Chong Wang, Maya Daneva, Marten van Sinderen, Peng Liang. Year: 2019. Aspect(s) covered: studies the types of user feedback and their up-to-date usage in Requirements Engineering activities. Challenges mentioned: whether user feedback is useful for RE purposes; the significance of explicit and implicit feedback in requirements elicitation, analysis, specification, validation and management.

2. “CREeLS: Crowdsourcing based Requirements Elicitation for eLearning Systems” (Rizk et al., 2019). Authors: Nancy M. Rizk, Mervat H. Gheith, Ahmed M. Zaki, Eman S. Nasr. Year: 2019. Aspect(s) covered: Crowdsourcing-based requirements elicitation for eLearning systems (CREeLS). Challenges mentioned: getting new ideas for requirements evolution; increasing the quality of requirements elicitation; coverage of all the requirements; communication and collaboration between the stakeholders.

3. “Crowdsourcing for Requirements Engineering: A Simplified Review” (Ahmad et al., 2018). Authors: Sabrina Ahmad, Nurul Atikah Rosmadi, Sharifah Sakinah Syed Ahmad, Siti Azirah Asmai. Year: 2018. Aspect(s) covered: Crowdsourcing for Requirements Engineering. Challenges mentioned: ensuring necessity and completeness; accessing a large number of audiences.

4. “An Overview of Crowdsourcing Concepts in Software Engineering” (Sari and Alptekin, 2017). Authors: Asli Sari, Gülfem Işıklar Alptekin. Year: 2017. Aspect(s) covered: discussion of definition, challenges and pricing theory. Challenges mentioned: workflows; security, privacy and law enforcement; poorly performing workers; proposing the right problem to the crowd; using mobile devices for crowdsourcing; quality of outcome, task design, reward mechanism, privacy and security threats, high-quality contributions and design of appropriate platforms.

Source(s): Author’s own work

Table 2: Crowdsourcing for requirements-related aspects of software projects – continued from Table 1

5. “Crowdsourcing Software Development: Concept, Benefits and Adoption” (Asiegbu Baldwin et al., 2017). Authors: Asiegbu Baldwin C., Oluigbo Ikenna V., Ajakwe Simeon O., Onyike Gerald O. Year: 2017. Aspect(s) covered: compares traditional outsourcing and crowdsourcing. Challenges mentioned: following the elements of Crowdsourcing; critical success factor model.

6. “Crowdsourcing for Software Engineering” (Stol et al., 2017). Authors: Klaas-Jan Stol, Thomas D. LaToza, Christian Bird. Year: 2017. Aspect(s) covered: discusses a taxonomy of Crowdsourcing tasks: rating, creation, processing, problem-solving. Challenges mentioned: not specifically mentioned.

7. “Crowdsourcing Software Development: Many Benefits, Many Concerns” (Hasteer et al., 2016). Authors: Nitasha Hasteer, Noshiba Nazir, Abhay Bansal, B. K. Murthy. Year: 2016. Aspect(s) covered: case study of three crowdsourcing platforms. Challenges mentioned: cost; schedule; quality.

8. “Dynamics of Software Development Crowdsourcing” (Dubey et al., 2016). Authors: Alpana Dubey, Kumar Abhinav, Sakshi Taneja, Gurdeep Virdi, Anurag Dwarakanath, Alex Kass, Mani Suma Kuriakose. Year: 2016. Aspect(s) covered: studies predictability in task completion on two crowdsourcing platforms, TopCoder and Upwork; also studies the dynamics of software development crowdsourcing platforms and the feature support they offer. Challenges mentioned: not specifically mentioned.

Source(s): Author’s own work

Table 3: Crowdsourcing for requirements-related aspects of software projects – continued from Table 2

9. “Software Crowdsourcing Challenges in the Brazilian IT Industry” (Machado et al., 2016). Authors: Leticia Machado, Josiane Kroll, Rafael Prikladnicki, Cleidson R. B. de Souza, Erran Carmel. Year: 2016. Aspect(s) covered: study to identify challenges by interviewing 20 experts in Crowdsourcing in Brazil. Challenges mentioned: tasks; lack of quality processes; lack of CS processes; people; cultural barriers.

10. “Configuring Crowdsourcing for Requirements Elicitation” (Hosseini et al., 2015). Authors: Mahmood Hosseini, Alimohammad Shahri, Keith Phalp, Jacqui Taylor, Raian Ali, Fabiano Dalpiaz. Year: 2015. Aspect(s) covered: covers crowdsourcing for requirements elicitation and investigates ways to configure crowdsourcing to improve the quality of elicited requirements; configuration of Crowdsourcing. Challenges mentioned: a set of challenges in CSRE relating to largeness, anonymity, diversity, competence, collaboration, intrinsic motivations, volunteering, extrinsic incentives, opt-out opportunity and feedback.

11. “Crowdsourcing Software Requirements and Development: A Mechanism-based Exploration of ‘Opensourcing’” (Naparat and Finnegan, 2013). Authors: Damrongsak Naparat, Patrick Finnegan. Year: 2013. Aspect(s) covered: Crowdsourcing software requirements and development; opensourcing in determining requirements, identifying bugs, providing user-to-user support and aiding software coding; propositions to overcome the challenges. Challenges mentioned: motivation; coordination; effective communication; filtering; integration and nurturing.

12. “CrowdREquire: A Requirements Engineering Crowdsourcing Platform” (Adepetu et al., 2012). Authors: Adedamola Adepetu, Khaja Altaf Ahmed, Yousif Al Abd, Aaesha Al Zaabi, Davor Svetinovic. Year: 2012. Aspect(s) covered: discusses how the CrowdREquire platform can be used to apply crowdsourcing to Requirements Engineering; a system for functional and non-functional requirements specification. Challenges mentioned: dividing work into subtasks; result assessment and rewarding.

Source(s): Author’s own work

Table 4: Crowdsourced user experience collection of mobile applications

1. “Conversational Crowdsourcing Made Easy” (Qiu et al., 2020). Authors: S. Qiu, U. Gadiraju, A. Bozzon. Year: 2020. Aspect(s) covered: to avoid the boredom and fatigue associated with crowdsourcing, the authors introduce conversational crowdsourcing systems, which are more interactive; a conversational agent is involved, improving user satisfaction and involvement. Challenges mentioned: how crowdsourcing affects worker satisfaction and performance; challenges of data supply and worker engagement.

2. “The Design of a Mobile Application for Crowdsourcing in Disaster Risk Reduction” (Nguyen et al., 2019). Authors: Quynh Nhu Nguyen, Antonella Frisiello, Claudio Rossi. Year: 2019. Aspect(s) covered: focuses on Crowdsourcing for collecting the crowd’s feedback for the development of a highly response-critical mobile application; an online survey was used to learn potential users’ expectations. Challenges mentioned: systematizing, cleaning, sorting and filtering the unstructured and unreliable information flow associated with crowdsourcing.

3. “Crowdsourcing Interface Feature Design with Bayesian Optimization” (Dudley et al., 2019). Authors: John J. Dudley, Jason T. Jacques, Per Ola Kristensson. Year: 2019. Aspect(s) covered: optimizing interface feature design using Bayesian optimization, implemented using crowdsourcing. Challenges mentioned: the mobile VR app interface design challenge of gaze cueing.

4. “Toward Crowdsourced User Studies for Software Evaluation” (Daniel et al., 2016). Authors: Florian Daniel, Pavel Kucherbaev. Year: 2016. Aspect(s) covered: a study on designing effective tasks for collecting user experience via crowdsourcing. Challenges mentioned: how crowdsourced studies can be conducted without compromising the benefits offered by in-lab studies.

Source(s): Author’s own work

Table 5: Crowdsourced user experience collection of mobile applications – continued from Table 4

5. “Apparition: Crowdsourced User Interfaces That Come to Life as You Sketch Them” (Lasecki et al., 2015). Authors: Walter S. Lasecki et al. Year: 2015. Aspect(s) covered: uses individuals in a crowd to prepare prototypes based on narrations they listen to; crowdworkers refine the prototypes of interfaces built by/for users based on their narrations, and the most suitable prototype evolves into the system. Challenges mentioned: managing parallel editing of the same interface; avoiding repetitive work and production blocking.

6. “Affective Crowdsourcing Applied to Usability Testing” (Gomide et al., 2014). Authors: Victor H. M. Gomide et al. Year: 2014. Aspect(s) covered: two factors are studied: applying usability tests remotely (via the crowd) and detecting outliers based on users’ emotional behavior; concluded that affective crowdsourcing was very useful. Challenges mentioned: getting relevance judgments when applying crowdsourcing.

7. “Crowdsourcing towards User Experience Evaluation: An Intelligent User Experience Questionnaire (IUEQ)” (Meedin and Perera, 2014). Authors: G. S. Nadeera Meedin, Indika Perera. Year: 2014. Aspect(s) covered: a literature review of challenges, and measures to overcome them, on collaboration platforms for user interface design; a discussion of the use of crowdsourcing for UI evaluation based on user experience. Challenges mentioned: nothing explicitly specified.

Source(s): Author’s own work

Table 6: Crowdsourced user experience collection of mobile applications – continued from Table 5

8. “Crowdsourcing User Interface Adaptations for Minimizing the Bloat in Enterprise Applications” (Akiki et al., 2013). Authors: P. A. Akiki, A. K. Bandara, Y. Yu. Year: 2013. Aspect(s) covered: reducing the visual complexity of software bloated with numerous features with the help of crowdsourcing; UI adaptations of the software’s UI are built from crowd feedback. Challenges mentioned: nothing specifically mentioned.

9. “Crowdsourcing Performance Evaluations of User Interfaces” (Komarov et al., 2013). Authors: Steven Komarov, Katharina Reinecke, Krzysztof Z. Gajos. Year: 2013. Aspect(s) covered: studied the feasibility of conducting online crowdsourced performance evaluations of UIs using the MTurk paid crowd; crowdsourcing was found to be an equally effective and better option in terms of resource requirements and other overheads. Challenges mentioned: challenges in implementing crowdsourced operations; identifying the fraction of participants who are extreme outliers; challenges relating to the environment of the crowd member and other affective factors specific to the crowd member.

10. “Crowdsourcing for Usability Testing” (Liu et al., 2012). Authors: Di Liu et al. Year: 2012. Aspect(s) covered: evaluates the potential of crowdsourced usability testing using two case studies, one in-lab and the other crowdsourced. Challenges mentioned: user involvement; controlling what the crowd tests; deriving useful feedback from answers.

Source(s): Author’s own work

Table 7: About Aarogya Setu, BevQ (Beverages Queue) and Manglish applications

  • Language(s): Aarogya Setu – 12 languages; BevQ – Malayalam, English; Manglish – Malayalam, English

  • Purpose: Aarogya Setu – spread awareness of COVID-19 and connect essential COVID-19-related health services to the people of India; BevQ – a queue-management mobile application; Manglish – type Malayalam-like English and get it auto-converted to Malayalam text (transliteration and speech-to-text)

  • Reachability: Aarogya Setu – different states in India; BevQ – virtual queue management at Beverages outlets inside Kerala State; Manglish – across the globe, to type/dictate Malayalam and get it as digital Malayalam text

  • Ownership: Aarogya Setu – National Informatics Centre, Govt. of India; BevQ – developed by Faircode Technologies of Kochi, Kerala, and made for launch by the Kerala State Beverages Corporation under the Govt. of Kerala; Manglish – provided by Clusterdev (the minimal version is free; the premium version must be purchased)

  • Initial release date: Aarogya Setu – April 2020; BevQ – May 2020; Manglish – September 2015

  • Present status: Aarogya Setu – working; BevQ – withdrawn; Manglish – working

  • Mobile: Aarogya Setu – Apple, Android mobiles and more; BevQ – Android mobiles, Apple; Manglish – Apple, Android mobiles and more

  • Operating systems: Aarogya Setu – Android, iOS; BevQ – Android 4.1 and up, iOS; Manglish – Android

  • Size: Aarogya Setu – 3.3 MB (Android), 13.3 MB (iOS); BevQ – 9.7 MB (Android), 27 MB (iOS); Manglish – 28 MB (Android)

  • Written in: Aarogya Setu – Kotlin and Java; BevQ – React Native, NodeJS; Manglish – Java

  • Availability: Aarogya Setu – on smartphones, uses Bluetooth and GPS technology; on non-smartphones, works by cellular triangulation of the phone; BevQ – uses GPS; Manglish – once downloaded, can be used offline also

Source(s): Author’s own work

Table 8: Results of the study using Crowdsourcing for UX (User Experience) collection of the AarogyaSetu and BevQ Malayalam mobile applications

1. Concerns and instructions – 34 answered

2. Using smartphone – 33 answered: Yes – 32 (97%); No – 1 (3%)

3. Age group – 28 answered: 18–28 – 6 (21%); 29–39 – 3 (11%); 40–50 – 16 (57%); 51–61 – 1 (4%); >61 – 2 (7%)

4. Gender – 27 answered: Males – 22 (81%); Females – 5 (19%)

5. Type of job – 26 answered: Academia – 7 (27%); Industry – 12 (46%); Self-employed – 5 (19%); Not employed – 2 (8%)

6. BevQ operational experience – 26 answered: Yes – 11 (44%); No – 14 (56%)

7. Aarogya Setu operational experience – 25 answered: Yes – 16 (64%); No – 9 (36%)

8. Knowledge of app installation and use – 25 answered: Level 1 – none; Level 2 – 4 (16%); Level 3 – 2 (8%); Level 4 – 8 (32%); Level 5 – 11 (44%)

9. Are both apps on the same phone? – 13 answered: Yes – 5 (38%); No – 8 (62%)

10. Details of smartphone – 6 answered (had to key in)

11. Do both apps run on the same Internet connectivity? – 11 answered: Yes – 7 (64%); No – 4 (36%)

12. Details of Internet connectivity – 2 answered (had to key in)

13. Overall performance of BevQ (10-point rating scale) – 8 answered: Level 3 – 3 (38%); Level 4 – 1 (13%); Level 5 – 1 (13%); Level 6 – 1 (13%); Level 7 – 1 (13%); Level 10 – 1 (13%)

14. Overall performance of Aarogya Setu (10-point rating scale) – 8 answered: Level 4 – 1 (13%); Level 7 – 1 (13%); Level 8 – 2 (25%); Level 9 – 1 (13%); Level 10 – 3 (38%)

Source(s): Author’s own work

Table 9: Results of Study II – Aarogya Setu and Manglish Malayalam mobile applications

1. Age – 47 answered: 18–28 – 35 (74%); 29–39 – 4 (9%); 40–50 – 5 (11%); 51–60 – 3 (6%); >61 – none

2. Gender – 47 answered: Male – 26 (55%); Female – 21 (45%)

3. Smartphone use in years – 47 answered: Not yet – 1 (2%); <3 – 2 (4%); 3–8 – 30 (64%); 9–14 – 11 (23%); 15–20 – 2 (4%); >20 – 1 (2%)

4. Ability to install and use mobile apps (1 = lowest) – 47 answered: 1 – 1 (2%); 3 – 6 (13%); 4 – 18 (38%); 5 – 22 (47%)

5. Number of applications on the mobile – 47 answered: <10 – 8 (17%); 10–20 – 12 (26%); 21–30 – 13 (28%); >30 – 14 (30%)

6. Number of applications used per day – 47 answered: <5 – 13 (28%); 5–10 – 28 (60%); 11–15 – 6 (13%); >15 – none

7. Internet connectivity on the mobile – 47 answered: 4 – 3 (6%); 6 – 6 (13%); 7 – 7 (15%); 8 – 14 (30%); 9 – 8 (17%); 10 – 9 (19%)

Source(s): Author’s own work

Table 10: Results of Study II – Aarogya Setu and Manglish Malayalam mobile applications – continued from Table 9

8. Highest educational qualification – 47 answered: +2/equivalent – 4 (9%); Degree/Graduate – 8 (17%); Postgraduate – 33 (70%); PhD – 2 (4%)

9. Level of IT awareness – 47 answered: 3 – 11 (23%); 4 – 21 (45%); 5 – 15 (32%)

10. Prior experience in using Manglish – 47 answered: Yes – 12 (26%); No – 35 (74%)

11. Prior experience in using Aarogya Setu – 47 answered: Yes – 27 (57%); No – 20 (43%)

12. Ready to help us? – 47 answered: Yes – 32 (68%); No – 15 (32%)

13. Aarogya Setu rating (first-time use) – 32 answered: feature-wise ratings (different factors)

14. Manglish rating (first-time use) – 32 answered: feature-wise ratings (different factors)

15. Chances of referring to a friend – 32 answered: 0 – 1 (3%); 3 – 1 (3%); 5 – 2 (6%); 6 – 4 (13%); 7 – 6 (19%); 8 – 7 (22%); 9 – 9 (28%); 10 – 2 (6%)

Source(s): Author’s own work

Table 11: Observations from Study I and Study II

1. Gender bias in attempting the questionnaire – Study I: evident; Study II: leveled. Remarks: (1) the BevQ app; (2) descriptive questions.

2. Age group response – Study I: middle-aged; Study II: youngsters. Remarks: (1) no BevQ in Study II; it may be a cultural/societal barrier; (2) objective questions only in Study II.

3. Total visits – Study I: 127; Study II: 279. Remarks: (1) many are keen, but not helpful; mentality may be a reason; (2) the time given was only 2 days for both questionnaires.

4. Attempted – Study I: 34; Study II: 47. Remark: attempted – 34/127 = 26.77% vs 47/279 = 16.85%; completed – 9/34 = 26.47% vs 47/47 = 100%.

5. Completed – Study I: 9; Study II: 47. Remarks: (1) more youngsters in Study II; mindset; (2) objective questions; (3) knowledge of apps and functional usage is present while technical know-how is less.

6. Average finishing time – Study I: 13 min 12 s; Study II: 2 min 36 s. Remark: indicates that the crowd prefers and likes it when choices are given; ease of use, ease of understanding.

Source(s): Author’s own work

References

Adepetu, A., Khaja, A.A., Al Abd, Y., Al Zaabi, A. and Svetinovic, D. (2012), “Crowdrequire: a requirements engineering crowdsourcing platform”, 2012 AAAI Spring Symposium Series.

Ahmad, S., Rosmadi, N.A., Syed Ahmad, S.S. and Asmai, S.A. (2018), “Crowdsourcing for requirements engineering: a simplified review”, International Journal of Computer Information Systems and Industrial Management Applications, Vol. 10, pp. 134-142.

Akiki, P., Bandara, A. and Yu, Y. (2013), “Crowdsourcing user interface adaptations for minimizing the bloat in enterprise applications”, Proceedings of the 5th ACM SIGCHI symposium on Engineering interactive computing systems.

Asiegbu Baldwin, C., Oluigbo Ikenna, V., Ajakwe Simeon, O. and Onyike Gerald, O. (2017), “Crowdsourcing software development: concept, benefits, and adoption”, International Journal of Scientific Research in Computer Science and Engineering, Vol. 5 No. 3, pp. 7-16.

Cappa, F., Federica, R. and Hayes, D. (2019), “Monetary and social rewards for crowdsourcing”, Sustainability, Vol. 11 No. 10, p. 2834.

Chandler, J. and Mueller, P. (2013), in Michelucci, P. (Ed.), Handbook of Human Computation, pp. 377-392.

Clusterdev (2021), “About Manglish”, available at: http://manglish.app/online (accessed 15 August 2021).

Daniel, F. and Kucherbaev, P. (2016), “Toward crowdsourced user studies for software evaluation”, arXiv preprint, arXiv:1609.01070.

Díaz-Oreiro, I., López, G., Quesada, L. and Guerrero, L.A. (2019), “Standardized questionnaires for user experience evaluation: a systematic literature review”, Multidisciplinary Digital Publishing Institute Proceedings, Vol. 31 No. 1, p. 14.

Dubey, A., Abhinav, K., Taneja, S., Virdi, G., Dwarakanath, A., Kass, A. and Kuriakose, M.S. (2016), “Dynamics of software development crowdsourcing”, 2016 IEEE 11th International Conference on Global Software Engineering (ICGSE), IEEE.

Dudley, J.J., Jacques, J.T. and Kristensson, P.O. (2019), “Crowdsourcing interface feature design with bayesian optimization”, Proceedings of the 2019 chi conference on human factors in computing systems.

Estellés-Arolas, E., Navarro-Giner, R. and González-Ladrón-de-Guevara, F. (2015), “Crowdsourcing fundamentals: definition and typology”, Advances in Crowdsourcing, Springer, Cham, pp. 33-48.

Goh, D.H.-L., Pe-Than, E.P.P. and Lee, C.S. (2017), “Perceptions of virtual reward systems in crowdsourcing games”, Computers in Human Behavior, Vol. 70, pp. 365-374.

Gomide, V.H.M., Valle, P.A., Ferreira, J.O., Barbosa, J.R., Da Rocha, A.F. and Barbosa, T. (2014), “Affective crowdsourcing applied to usability testing”, International Journal of Computer Science and Information Technologies, Vol. 5 No. 1, pp. 575-579.

Hao, Y., Chong, W., Man, K.L., Liu, O. and Shi, X. (2016), “Key factors affecting user experience of mobile crowdsourcing applications”, Proceedings of the International Multi Conference of Engineers and Computer Scientists, Vol. 2.

Hasteer, N., Nazir, N., Bansal, A. and Murthy, B.K. (2016), “Crowdsourcing software development: many benefits many concerns”, Procedia Computer Science, Vol. 78, pp. 48-54.

Hosseini and Mahmoud (2014), “Crowdsourcing definitions and its features: an academic technical report”, RCIS 2014 Conference.

Hosseini, M., Shahri, A., Phalp, K., Taylor, J., Ali, R. and Dalpiaz, F. (2015), “Configuring crowdsourcing for requirements elicitation”, 2015 IEEE 9th International Conference on Research Challenges in Information Science (RCIS), IEEE.

Khan, H.H., Malik, M.N., Alotaibi, Y., Alsufyani, A. and Alghamdi, S.A. (2021), “Crowdsourced requirements engineering challenges and solutions: a software industry perspective”, Computer Systems Science and Engineering, Vol. 39 No. 2, pp. 221-236.

Kietzmann and Jan, H. (2017), “Crowdsourcing: a revised definition and introduction to new research”, Business Horizons, Vol. 60 No. 2, pp. 151-153.

Komarov, S., Reinecke, K. and Gajos, K.Z. (2013), “Crowdsourcing performance evaluations of user interfaces”, Proceedings of the SIGCHI conference on human factors in computing systems.

Lasecki, W.S., Kim, J., Rafter, N., Sen, O., Bigham, J.P. and Bernstein, M.S. (2015), “Apparition: crowdsourced user interfaces that come to life as you sketch them”, Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems.

LaToza, T.D. and Van Der Hoek, A. (2015), “Crowdsourcing in software engineering: models, motivations, and challenges”, IEEE Software, Vol. 33 No. 1, pp. 74-80.

Liu, D., Bias, R.G., Lease, M. and Kuipers, R. (2012), “Crowdsourcing for usability testing”, Proceedings of the American Society for Information Science and Technology, Vol. 49 No. 1, pp. 1-10.

Machado, L., Kroll, J., Marczak, S. and Prikladnicki, R. (2016), “Software crowdsourcing challenges in the Brazilian IT industry”, International Conference on Enterprise Information Systems, Italy.

Meedin, G.S.N. and Perera, I. (2014), “Crowdsourcing towards User Experience evaluation: an intelligent user experience questionnaire (IUEQ)”, 14th International Conference on Advances in ICT for Emerging Regions (ICTer), IEEE.

MonkeyLearn Team (2021), “About MonkeyLearn”, available at: https://monkeylearn.com/word-cloud/ (accessed 15 September 2021).

Naparat, D. and Finnegan, P. (2013), “Crowdsourcing software requirements and development: a mechanism-based exploration of ‘opensourcing’”, Proceedings of the Nineteenth Americas Conference on Information Systems, Chicago, Illinois, August 14-17, 2013.

Nguyen, Q.N., Frisiello, A. and Rossi, C. (2019), “The design of a mobile application for crowdsourcing in disaster risk reduction”, ISCRAM.

NIC for Government of India (2021a), “About AarogyaSetu”, available at: https://www.aarogyasetu.gov.in (accessed 15 September 2021).

NIC for Government of Kerala (2021b), “About Government of Kerala”, available at: https://www.kerala.gov.in (accessed 15 August 2021).

NIC for Kerala State Beverages Corporation, Government of Kerala (2021), “About Beverages Corporation”, available at: https://bevco.in/ (accessed 15 September 2021).

Qiu, S., Gadiraju, U. and Bozzon, A. (2020), “Ticktalkturk: conversational crowdsourcing made easy”, Conference Companion Publication of the 2020 on Computer Supported Cooperative Work and Social Computing.

Rizk, N.M., Gheith, M.H., Zaki, A.M. and Nasr, E.S. (2019), “CREeLS: crowdsourcing based requirements elicitation for eLearning Systems”, International Journal of Advanced Computer Science and Applications, Vol. 10, p. 10.

Roy, S.K. and Ganguli, S. (2008), “Service quality and customer satisfaction: an empirical investigation in Indian mobile telecommunications services”, Marketing Management Journal, Vol. 18 No. 2, pp. 119-144.

Sari, A. and Alptekin, G.I. (2017), “An overview of crowdsourcing concepts in software engineering”, International Journal of Computers, Vol. 2, pp. 1-9.

Stol, K.-J., LaToza, T.D. and Bird, C. (2017), “Crowdsourcing for software engineering”, IEEE Software, Vol. 34 No. 2, pp. 30-36.

Wang, C., Daneva, M., van Sinderen, M. and Liang, P. (2019), “A systematic mapping study on crowdsourced requirements engineering using user feedback”, Journal of Software: Evolution and Process, Vol. 31 No. 10, e2199.

Corresponding author

Malathi Sivasankara Pillai is the corresponding author and can be contacted at: nair.malu@gmail.com

About the authors

Malathi Sivasankara Pillai is an Assistant Professor in the Department of Computer Applications, Cochin University of Science and Technology, Kochi, Kerala, India. She holds a Master of Computer Applications (MCA) and an MTech in Software Engineering from reputed universities. Her areas of research interest are software engineering, software quality assurance, software project management, education, and speech and audio processing.

Dr Kannan Balakrishnan is an Emeritus Professor in the Department of Computer Applications, Cochin University of Science and Technology, Kochi, Kerala, India. He holds an MSc in Mathematics, an MPhil, an MTech in Computer Science and a PhD from reputed universities. He has produced many PhDs under his guidance and has many international publications to his credit. He is also a journal peer-review team member and a book author. Dr Kannan’s areas of research interest include, but are not limited to, graph networks, graph networks in computer applications, artificial intelligence, neural networks, deep learning and intelligent computing.
