What top 3 criteria would you recommend someone use to evaluate an app they are thinking of using?

As a student, you will gather information from a variety of sources for your research projects, including books, newspaper articles, magazine articles, specialized databases, and websites. As you examine each source, it is important to evaluate the quality of the information it provides. Common evaluation criteria include: purpose and intended audience, authority and credibility, accuracy and reliability, currency and timeliness, and objectivity or bias. Each of these criteria is explained in more detail below.

Purpose and intended audience

  • What is the purpose of the source? For example:
    • To provide information (e.g., newspaper articles)
    • To persuade or advocate (e.g., editorials or opinion pieces)
    • To entertain (e.g., a viral video)
    • To sell a product or service (e.g., advertising or marketing materials on a company website)
  • Who is the intended audience? For example:
    • Scholars and academic researchers with specialized knowledge
    • The general public (without specialized knowledge)
    • Students in high school, college, or university (e.g., textbooks for students learning a new subject)

Authority and credibility

  • Who is the author?
    • Is it a person?
    • Is it an organization such as a government agency, nonprofit organization, or a corporation?
  • What are the qualifications of the author?
    • What is the author's occupation, experience, or educational background?
    • Does the author have any subject matter expertise?
    • Is the author affiliated with an organization such as a university, government agency, nonprofit organization, or a corporation?
  • Who is the publisher?
    • For books, is it a university press or a commercial publisher? Both types of publishers use editors to ensure a quality publication.
    • For journals or magazines, can you tell if it is popular or scholarly in nature? See: Peer-reviewed, popular magazine, or journal?
    • For websites, is it an organizational website or a personal blog? (A rough domain-based cue is sketched below.)
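A website's domain suffix can give a rough first impression of who the publisher is. Below is a minimal sketch in Python (the suffix-to-publisher mapping and the example URLs are illustrative assumptions, not part of this guide); treat the result as a starting hint and still check the site's "About" page.

```python
from urllib.parse import urlparse

# Rough first-pass cues only: a .com site can host excellent information and a
# .org can host advocacy, so always confirm the publisher on the "About" page.
SUFFIX_HINTS = {
    ".gov": "government agency",
    ".edu": "college or university",
    ".org": "organization (nonprofit, advocacy, or other)",
    ".com": "commercial site (company, publisher, or personal)",
}

def publisher_hint(url: str) -> str:
    """Return a coarse guess at the publisher type based on the domain suffix."""
    host = urlparse(url).hostname or ""
    for suffix, hint in SUFFIX_HINTS.items():
        if host.endswith(suffix):
            return hint
    return "unknown; check the site's 'About' page"

if __name__ == "__main__":
    # Hypothetical example URLs.
    for url in ["https://www.cdc.gov/flu", "https://example.com/blog/post"]:
        print(url, "->", publisher_hint(url))
```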

Accuracy and reliability

  • Is the information well researched?
    • Are there references (e.g., citations, footnotes, or a bibliography) to sources that will provide evidence for the claims made?
    • If the source includes facts or statistical data, can this information be verified in another source?
    • If the data was gathered using original research (such as polling or surveys), what was the method of data collection? Has the author disclosed the validity or reliability of the data?

Currency and timeliness

  • When was the information published?
    • For books and articles, you should be able to verify the publication date easily.
    • For websites, try to determine the date the web page was created or last updated (one way to check is sketched after this list).
  • Is current information required? If not, accurate but historical information may still be acceptable.
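For pages that do not display a date anywhere, one place to look is the HTTP response headers. Below is a minimal sketch in Python using only the standard library (the example URL is arbitrary; many servers omit or regenerate the Last-Modified header, so treat the result as a hint rather than proof of currency).

```python
from urllib.request import Request, urlopen

def last_modified(url: str, timeout: float = 10.0):
    """Return the Last-Modified response header for a URL, or None if absent."""
    request = Request(url, method="HEAD")  # HEAD fetches headers without the body
    with urlopen(request, timeout=timeout) as response:
        return response.headers.get("Last-Modified")

if __name__ == "__main__":
    # Arbitrary example URL; dynamically generated pages often return None.
    print(last_modified("https://example.com/"))
```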

Objectivity or bias

  • Does the source contain opinions or facts?
  • Is the information presented in the source objective (unbiased) or subjective (biased)?
  • Does the information promote a political, religious, or social agenda?
  • Is advertising content (usually found in business magazines or newspapers) clearly labelled?

In Summary

  • Does the source provide you with high-quality information? Is the information useful in answering your questions and meeting your information need?

Adapted from Burkhardt, J.M., & MacDonald, M.C. (2010). Teaching information literacy: 50 standards-based exercises for college students. Chicago: American Library Association.
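When comparing several sources or apps side by side, the five criteria above can be turned into a simple scorecard. Below is a minimal sketch in Python; the criteria names follow this guide, but the 0-2 rating scale and the sample ratings are illustrative assumptions, not part of the original exercise.

```python
from dataclasses import dataclass, field

# Criteria taken from the guide above.
CRITERIA = [
    "purpose and intended audience",
    "authority and credibility",
    "accuracy and reliability",
    "currency and timeliness",
    "objectivity or bias",
]

@dataclass
class SourceEvaluation:
    title: str
    scores: dict = field(default_factory=dict)

    def rate(self, criterion: str, score: int) -> None:
        """Record a rating of 0 (poor), 1 (fair), or 2 (strong) for one criterion."""
        if criterion not in CRITERIA:
            raise ValueError(f"Unknown criterion: {criterion}")
        if score not in (0, 1, 2):
            raise ValueError("Score must be 0, 1, or 2")
        self.scores[criterion] = score

    def total(self) -> int:
        """Sum of the ratings given so far (maximum is 2 per criterion)."""
        return sum(self.scores.values())

if __name__ == "__main__":
    evaluation = SourceEvaluation("Example news article")  # hypothetical source
    evaluation.rate("authority and credibility", 2)
    evaluation.rate("currency and timeliness", 1)
    print(evaluation.title, "scores", evaluation.total(), "out of", 2 * len(CRITERIA))
```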

Number of criteria for evaluation of mHealth app quality identified in the literature search (N=349 criteria in total):

  • App classification, confidentiality, security, registration, community, affiliation: 12 (3.4%)
  • Aesthetics, graphics, layout, visual appeal: 52 (14.8%)
  • Engagement, entertainment, customization, interactivity, fit to target group, etc.: 66 (18.9%)
  • Functionality, performance, navigation, gestural design, ease of use: 90 (25.8%)
  • Information, quality, quantity, visual information, credibility, goals, description: 113 (32.4%)
  • Subjective quality, worth recommending, stimulates repeat use, overall satisfaction rating: 16 (4.6%)
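As a quick sanity check on the list above, the six counts should sum to the stated N, and each percentage is simply the count divided by N. Below is a minimal sketch in Python (category labels are abbreviated; recomputed percentages can differ from the published figures by about 0.1 because of rounding).

```python
# Counts copied from the table above; category labels are abbreviated.
counts = {
    "App classification / security": 12,
    "Aesthetics": 52,
    "Engagement": 66,
    "Functionality": 90,
    "Information": 113,
    "Subjective quality": 16,
}

total = sum(counts.values())
print("Total criteria:", total)  # expected: 349

for category, count in counts.items():
    # e.g. 113 / 349 * 100 = 32.4 (to one decimal place)
    print(f"{category}: {count} ({100 * count / total:.1f}%)")
```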